Since I've been beaten to the punch on the Pokemon joke, I'll give a serious opinion.
'Normal' can be described a few different ways. The best way I can think of to explain it is with a math analogy. In a set of numbers, there are certain summary values: the mean (the average, found by adding all the numbers together and dividing by how many there are), the median (the middle number when the set is ordered from least to greatest), the mode (the number that appears most often), and the range (the highest number minus the lowest). The most common definitions of 'normal' tend to follow those pretty closely. Some people treat 'normal' as the same thing as 'average': the value that best represents the whole set, close to the mode, not pulled too far toward either extreme (since the extremes cancel each other out), and not leaning too heavily on the median. In people terms, that's someone who can blend in and fit with most crowds.
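To make those four measures concrete, here's a quick sketch in Python (a made-up set of numbers and the standard statistics module, nothing specific to this discussion):

```python
import statistics

data = [2, 3, 3, 5, 7, 10]

print(statistics.mean(data))    # (2+3+3+5+7+10) / 6 = 5
print(statistics.median(data))  # middle of the ordered set: (3+5)/2 = 4
print(statistics.mode(data))    # most frequent value: 3
print(max(data) - min(data))    # range: 10 - 2 = 8
```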
Another common definition of 'normal' works like the median (and it's the one I find least accurate). In medicine, someone with a terminal disease is usually quoted the median number of years people with that disease survive, say, three years. But the mean is almost always more telling, say, ten years, because a few long-term survivors pull the average well above the midpoint and the median hides that. Applied to a group of people, the median definition says 'normal' is whoever sits at the exact midpoint, a 50 on a scale of 0 to 100. In practice, though, the person who actually feels typical for the group is usually closer to a 65 or a 35, because the middle position by itself doesn't tell you where most of the group is clustered.
Yet another is the 'mode' definition of normal: whichever group shows up most often in a given environment is the 'normal' one. There's an obvious flaw here. Real life isn't split 85-15 between nerds and jocks; there are myriad social groups. Say there are 1,000 people spread across 100 different social groups: 10 people in 98 of the groups, 9 in one, and 11 in one. Just like that, the group of 11 becomes 'normal' under this definition, even though it's barely over 1% of the population. Obviously not a fair representation.
Finally, there's range, found by subtracting the lowest value from the highest. In a set of numbers this can go awry very fast. Take an extreme example of five numbers: 1; 4; 8; 11; and 500,000,000. The range is 499,999,999, which tells you almost nothing about the set. The same problem shows up with people: take a school full of nerds and one super-athletic person who got in, and 'normal' suddenly sits somewhere between the two, when obviously the nerdier personality type is what's actually normal there.
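Running the same kind of check on that extreme example (again just an illustrative Python sketch) shows how one outlier distorts the range and the mean while the median barely moves:

```python
import statistics

data = [1, 4, 8, 11, 500_000_000]

print(max(data) - min(data))    # range: 500,000,000 - 1 = 499,999,999
print(statistics.mean(data))    # 100,000,004.8 -- nowhere near a 'typical' member of the set
print(statistics.median(data))  # 8 -- still looks like the rest of the set
```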
TL;DR: Definitions of normal should change to better adhere to whatever set of data you're presented with.