With all the fuss over human rights legislation, there seems to be a trick being missed.
In his Robot novels, Isaac Asimov postulated three Laws of Robotics:
1. A robot may not harm a human being, or, through inaction, allow a human being to come to harm.
2. A robot must obey the orders given to it by human beings except where such orders would conflict with the First Law.
3. A robot must protect its own existence, as long as such protection does not conflict with the First or Second Law.
Eventually, in a later novel (Robots and Empire), he realised that there should be a Zeroth Law:
0. A robot may not injure humanity, or, through inaction, allow humanity to come to harm.
A condition stating that the Zeroth Law must not be broken was then added to each of the original three Laws.
(Reference: http://en.wikipedia.org/wiki/Three_Laws_of_Robotics)
So, shouldn't any human rights legislation likewise take on board a simple zeroth law: a law of Humanity Rights?