For those who believe in some sort of categorical imperative or divine command system of morality, there are certain things that everyone should do, regardless of circumstances. Many religious rituals and beliefs fall into this category, as do things like the "golden rule" or total pacifism. In these types of systems, a moral act is moral for everyone. You, me, Bob, Jill. If it's good for one of us to donate to charity, then it's good for all of us to donate to charity.
Then you have various forms of consequentialism, where whether or not an action is moral is based on what happens (or will probably happen) as a result. In these types of moral systems, it is entirely possible to have different people correctly doing different things. Oftentimes this is explained by different circumstances. For instance, killing in self-defense is okay while killing for money is not. So given two people in different circumstances, it could be correct for one to kill but incorrect for the other.
And yet, even in the evaluation of most consequentialists, two people in identical circumstances (perhaps including identical mental states) ought to take identical actions. If it's okay for me to kill given the "self-defense" circumstance, then it's okay for you to kill given the "self-defense" circumstance. Because of this, there's still a sense in which a moral act is moral for everyone, once you've taken the circumstances into consideration. If action X is the correct move in circumstance Y, then it's correct for everyone.
Most forms of consequentialism have learned to incorporate some sort of game theory, and work with expected payoffs. This is necessary to prevent luck from having a significant impact on morality. If I perform an act that 99% of the time leads to a great benefit for humanity, then the unlucky 1% of people whose act happens to fail should not be deemed immoral just because the dice came up badly. But if you're going to steal some of game theory, you should at least steal all of it. And that includes the concept of a mixed strategy.
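To make the "judge by expectation, not by luck" idea concrete, here is a minimal sketch with hypothetical numbers: an act that produces a large benefit 99% of the time and a modest harm the other 1%. Everyone who chooses the act makes the same positive-expectation choice, whichever outcome they happen to draw.

```python
# Hypothetical utilities: the act yields +100 with probability 0.99
# and -50 with probability 0.01.
p_success, benefit, harm = 0.99, 100.0, -50.0

# The moral evaluation attaches to this number, which is identical for
# the lucky 99% and the unlucky 1%:
expected = p_success * benefit + (1 - p_success) * harm  # = 98.5
```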
In some games, the correct decision is not "do X," but instead "do X some fraction of the time, and do Y otherwise." A strategy that randomizes among your actions like this is called a mixed strategy, and in some games the maximum expected payoff comes only from playing one. So if we're seriously going to try to maximize expected payoff, we need to realize that this sometimes means varying our actions. And if we expect large numbers of people to use this system, we should thus expect people to act differently in identical situations and still be acting morally.
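The classic illustration is matching pennies, where any pure strategy is exploitable and only randomizing is safe. This is a sketch with the standard payoffs (the row player wins +1 on a match, loses 1 on a mismatch), not anything specific to the moral examples that follow:

```python
# Matching pennies, from the row player's perspective.
payoff = {("H", "H"): 1, ("H", "T"): -1, ("T", "H"): -1, ("T", "T"): 1}

def expected_payoff(p_row_heads, p_col_heads):
    """Row player's expected payoff when each side plays H with the given probability."""
    total = 0.0
    for r, pr in (("H", p_row_heads), ("T", 1 - p_row_heads)):
        for c, pc in (("H", p_col_heads), ("T", 1 - p_col_heads)):
            total += pr * pc * payoff[(r, c)]
    return total

# Always playing heads is exploitable: the column player just plays tails.
worst_pure = min(expected_payoff(1.0, q) for q in (0.0, 1.0))        # -1.0
# The 50/50 mixed strategy guarantees 0 no matter what the opponent does.
worst_mixed = min(expected_payoff(0.5, q) for q in (0.0, 0.5, 1.0))  # 0.0
```

The point carries over directly: a rule of the form "always do X" can be strictly worse than "do X with some probability."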
Once you realize this, it's actually quite trivial to construct examples that illustrate the point. And the examples aren't even all that far-fetched. For instance, let's say a man in a wheelchair needs to gain access to a building that lacks automatic doors. Is it good to help this man by opening a door for him? Most people would say that yes, this is a good thing to do. But should everyone do it? If there are five people nearby, should all five of them try to open the door for this man?
You only need one person to open the door. It's overkill, a waste of resources, to have all five people open the door. It is clearly better for one of the five to open the door than for all five to try. And this holds even if we make the situations completely identical. Let's say each person is equidistant from the door, each is in equally good physical condition, has exactly the same types of time constraints, etc. It's still best for just one of these people to open the door, and not for all five to try to do so. Even given identical situations, we should expect the five people to act differently while still acting correctly.
So while "do what you'd want everyone to do" is often a decent principle, a better one is "do what makes the world a place where you'd want to live." I would not want to live in a world where everyone became a doctor. I recognize that taking care of the sick is a very good thing to do, but there's other stuff that needs to get done as well. So while we need doctors, and may even need more doctors, Doctortopia would be a terrible place to live (it would have a huge unemployment rate, for one thing). It's not enough to recognize that we need doctors. You have to try to figure out how many doctors we need.
And yet, at the same time, beware the bystander effect. While it is true that only one of the five people needs to open the door for the guy in the wheelchair, it still needs to get done. While only one person helping is better than five, five people helping is better than zero. The point isn't to absolve yourself of responsibility. The point is to recognize that morality can't adequately be expressed as a set of actions you're supposed to do. Even if you take into account the circumstances, a simple list of "If situation X, do action Y" is only going to get you so far. Every now and then, you have to be willing to mix it up.