
    Ahmed Beziz

    Human Nature and the Failure of Ideal Societies

    War, pollution, corruption, inequality, tyranny, exploitation, and more. The world is filled with suffering and injustice. What many people fail to fully recognize, however, is that nearly all of these evils share one fundamental characteristic: they are human-made.

    Most of the pain and suffering endured by humanity is not the result of natural disasters or unavoidable forces, but of deliberate or negligent human actions. For thousands of years, despite immense technological, scientific, and cultural advancement, humanity has remained unable to overcome this destructive pattern of behavior. This raises a deeply unsettling question: why, after all this progress, are we still the same?

    This book argues that the failure lies not in a lack of knowledge, resources, or moral frameworks, but in human nature itself. Religions, political ideologies, and economic systems repeatedly promise the creation of a just, peaceful, and humane society. Yet history demonstrates that every such system eventually collapses into corruption, oppression, or violence.

    This is not a shallow or cynical claim. On the contrary, thousands of psychological studies and social experiments have revealed uncomfortable truths about the human mind. We are far less rational than we believe ourselves to be. Our decisions are driven by cognitive biases, emotional impulses, tribal instincts, and social pressures. We are highly susceptible to manipulation, persuasion, and misinformation.

    These psychological weaknesses are not accidental flaws. They are actively studied, refined, and exploited by governments, corporations, militaries, and other centers of power. Modern systems of control do not rely solely on force or violence, but on influence: shaping beliefs, emotions, fears, and desires at scale.

    This book will examine how these mechanisms operate, why all idealized systems inevitably fail, and how a deeper understanding of human psychology is essential if we are ever to reduce suffering and build a more honest and resilient society.

    I will explore six weak points in the human mind that can be and are being exploited by smart, powerful entities.

    How you use this information is up to you. I do not take responsibility for bad actors who may use these facts to take advantage of others.

    The purpose of writing this book is to send a message, and I will use these facts to hammer it home.

    In the end, even a spoon can be used to gouge an eye in the wrong hands.

    Ahmed Beziz

    Chapter 1

    Commitment and Consistency: The Human Drive for Coherence

    Human beings place great value on internal stability. Across cultures and historical periods, consistency has been associated with reliability, rationality, and moral integrity. To be seen as someone who “stands by their word” is socially rewarded, while inconsistency is often judged as weakness, hypocrisy, or lack of character. This cultural preference is not merely social; it reflects a deep psychological mechanism that shapes decision making, belief formation, and behavior.

    The principle of commitment and consistency describes the tendency for individuals to align future actions with past choices, statements, or behaviors. Once a commitment is made, especially if it is voluntary, public, or effortful, people experience pressure to behave in ways that confirm it. This pressure does not usually arise from external enforcement, but from an internal need to preserve a coherent self-image.

    At its core, this phenomenon is driven by discomfort. When actions and beliefs do not align, the individual experiences psychological tension. Rather than revisiting the original decision, people often resolve this tension by adjusting their attitudes or doubling down on previous choices. Consistency, therefore, becomes a shortcut for reducing mental strain.

    Commitments serve an important cognitive function. The world presents an overwhelming number of choices, and constantly reevaluating every decision would be mentally exhausting. By committing to a position, the brain simplifies future decisions. Once a stance is adopted, related choices can be made automatically, without extensive analysis.

    However, this efficiency comes at a cost. When circumstances change or when new information emerges, the desire for consistency can prevent rational reassessment. Individuals may continue supporting ideas, relationships, or systems that no longer serve them, simply because abandoning them would require admitting error.

    Importantly, the strength of commitment increases under specific conditions. Commitments are more powerful when they are:

    * Made freely rather than under force

    * Expressed publicly rather than privately

    * Written rather than spoken

    * Associated with effort, sacrifice, or identity

    These features transform simple actions into psychological anchors.

    One of the most influential demonstrations of this principle came from a field experiment conducted in everyday settings. Researchers approached individuals with a minor request that required little effort and carried no obvious cost. Examples included answering a few questions or agreeing to a small symbolic action.

    Days or weeks later, the same individuals were contacted again, this time with a far more demanding request. The key finding was that those who had agreed to the initial small action were significantly more likely to comply with the larger one than those who had not been previously contacted.

    Crucially, nothing about the first request logically required acceptance of the second. The connection existed only at the level of self-perception. By agreeing once, participants had unconsciously categorized themselves as cooperative, supportive, or civic-minded. Refusing later would have conflicted with this emerging identity.

    This experiment demonstrated that behavior does not merely express attitudes; it actively shapes them. A small action can quietly redefine how a person sees themselves, and future behavior follows that definition.

    A second line of evidence comes from laboratory research on belief change following behavior. In one such experiment, participants were asked to engage in a monotonous task designed to be uninteresting. After completing it, they were instructed to describe the task to another person as enjoyable.

    Some participants received a meaningful reward for doing so, while others received a token amount that offered little justification. When later asked to evaluate the task privately, those who had received minimal reward reported finding it more enjoyable than those who were paid well.

    This counterintuitive result highlights a critical aspect of consistency. When external explanations are weak, the mind searches for internal ones. Since the participants could not easily justify their behavior through compensation, they altered their beliefs to match their actions. The behavior came first; the belief followed.

    This process is not deliberate deception. Participants were not consciously lying to themselves. Rather, belief change occurred as a psychological repair mechanism, restoring alignment between action and self-concept.

    The drive for consistency has profound implications beyond the laboratory. It plays a role in:

    * Persistence in failing projects

    * Loyalty to harmful groups or ideologies

    * Difficulty leaving unhealthy relationships

    * Resistance to corrective evidence

    Once individuals invest time, identity, or public reputation into a position, withdrawal becomes psychologically expensive. The original commitment may fade in importance, but the need to remain consistent grows stronger with each reaffirmation.

    This mechanism also explains why early choices carry disproportionate influence. Initial commitments often occur with limited information, yet they can set trajectories that are difficult to reverse. Over time, rationalization accumulates, and the original decision becomes embedded in identity.

    Commitment and consistency are not flaws in themselves. They enable stability, trust, and coordinated social life. Without them, long-term goals and relationships would be difficult to sustain. However, when consistency becomes an end in itself, it can trap individuals in patterns that no longer reflect reality or personal well-being.

    Understanding this principle reveals an uncomfortable truth: people are often less guided by present evidence than by past behavior. The mind prefers coherence over accuracy, and once a line has been drawn, it resists erasing it.

    Recognizing this tendency is the first step toward regaining flexibility: the ability to revise beliefs without experiencing a collapse of identity. In a world that constantly changes, the capacity to update oneself may be more valuable than consistency itself.

    Chapter 2

    The Halo Effect: How First Impressions Reshape Reality

    Human judgment rarely begins from a neutral position. When encountering another person, the mind rapidly forms an overall impression, often within seconds. This initial evaluation then becomes a lens through which all subsequent information is interpreted. The halo effect describes this process: a single noticeable characteristic influences the way unrelated traits are perceived, evaluated, and remembered.

    Rather than carefully assessing each attribute independently, the brain tends to generalize. If someone appears competent, attractive, confident, or high-status, observers often assume they are also intelligent, ethical, and capable in other areas. This shortcut reduces cognitive effort, but it comes at the cost of accuracy.

    From an evolutionary perspective, rapid judgments had survival value. Early humans benefited from quickly deciding whether another individual was a potential ally or threat. Over time, this adaptive mechanism became embedded in social cognition. In modern environments, however, the same mechanism operates in contexts where precision matters, such as education, employment, justice, and leadership.

    The halo effect thrives under conditions of limited information. When data is incomplete, the brain fills gaps by extending known traits into unknown ones. This process often happens automatically and feels intuitive, making it difficult to detect or correct.

    One of the earliest and most robust demonstrations of the halo effect involved physical attractiveness. In controlled experiments, participants were shown images of individuals they had never met. They were then asked to make judgments about personality, intelligence, moral character, and future success.

    Across multiple trials, attractive individuals were consistently rated more favorably on nearly all dimensions. Participants assumed they were more competent, more socially skilled, and more likely to lead successful lives. These conclusions were drawn without any behavioral evidence.

    What makes this finding especially significant is that participants often denied being influenced by appearance. Even when explicitly instructed to ignore physical traits, the bias persisted. This suggests that the halo effect operates below conscious awareness and is resistant to simple correction.

    The halo effect extends far beyond personal attraction. In organizational settings, it can distort performance assessments, promotions, and leadership selection. Early research on workplace evaluations revealed that when supervisors formed a positive impression of an employee in one area, such as punctuality or communication, that impression spread to unrelated traits like reliability, creativity, or leadership potential.

    This pattern occurred even when evaluators were experienced and believed themselves to be objective. Rather than judging each category independently, they unconsciously allowed one standout quality to dominate their overall assessment.

    The reverse effect also occurred. A single negative trait could create a “horn effect,” where poor performance in one area contaminated evaluations across the board. Once a negative impression formed, positive behaviors were discounted or overlooked.

    The halo effect does not merely shape perception; it can actively influence outcomes. When authority figures expect higher performance from certain individuals, they often behave differently toward them, providing more attention, encouragement, and opportunities. These subtle differences can lead to improved performance, reinforcing the original impression.

    In this way, the halo effect becomes self-confirming. The belief shapes behavior, which then produces evidence that appears to validate the belief. What began as an assumption turns into a reality created by differential treatment.

    This dynamic is particularly influential in educational environments. Students perceived as gifted or motivated often receive more support, while those labeled as weak or difficult may face lower expectations. Over time, these expectations can shape academic trajectories, independent of original ability.

    The halo effect also plays a role in moral reasoning. When someone is admired or respected, their questionable actions are more likely to be excused or reinterpreted. Conversely, disliked individuals are judged more harshly for similar behavior. Moral evaluation becomes linked to identity rather than action.

    This explains why public figures with positive reputations often retain support despite serious misconduct, while others are condemned for minor infractions. The initial global impression acts as a filter that determines how evidence is weighed.

    The difficulty in countering the halo effect lies in its efficiency. It feels natural, effortless, and often accurate enough to go unquestioned. Deliberate correction requires slowing down judgment, separating traits, and actively seeking disconfirming evidence, processes that demand cognitive effort.

    Research suggests that structured evaluation systems, blind assessments, and delayed judgment can reduce halo-based bias. However, even these measures do not eliminate it entirely. The human mind remains inclined to favor coherent stories over fragmented truths.

    The halo effect reveals a fundamental limitation in human judgment: the tendency to confuse surface signals with deeper qualities. While this bias simplifies social interaction, it distorts fairness, accuracy, and accountability.

    Understanding the halo effect does not mean abandoning first impressions entirely, but recognizing their influence and limits. In complex social systems, the ability to separate appearance from substance is not intuitive; it is a skill that must be consciously developed.

    Chapter 3

    Social Proof: When the Crowd Becomes the Compass

    Human beings are inherently social. From early childhood, individuals learn not only through direct experience, but by observing the actions and reactions of others. When situations are ambiguous or unfamiliar, people often rely on collective behavior as a guide. Social proof refers to the tendency to interpret the choices of others as evidence of what is correct, acceptable, or safe.

    This mechanism is especially powerful in moments of uncertainty. When the right course of action is unclear, the behavior of the group offers a shortcut. If many people appear to agree, the mind assumes that they collectively know something the individual does not.

    From an evolutionary standpoint, following the group often increased survival. If many members of a group avoided a certain area or adopted a specific behavior, it was likely for a reason. Over time, this reliance on group behavior became deeply ingrained in human psychology.

    In modern contexts, however, this same tendency can produce irrational outcomes. Groups can be misinformed, biased, or driven by fear. Yet the psychological weight of consensus remains strong, even when evidence contradicts it.

    One of the clearest demonstrations of social proof occurred in a laboratory setting involving a straightforward visual task. Participants were asked to compare line lengths, an activity with an objectively correct answer. Unknown to the real participant, the other individuals in the room were instructed to give incorrect responses.

    Despite the simplicity of the task, a substantial number of participants went along with the group’s wrong answer at least once. Some conformed to avoid social discomfort, while others doubted their own perception. The presence of unanimous agreement from others created pressure strong enough to override direct sensory input.

    This experiment revealed that social proof does not require complex reasoning or emotional involvement. Even in neutral situations, the opinions of others can reshape perception itself.

    Social proof operates through two overlapping forces. Informational influence occurs when individuals assume the group possesses better knowledge. Normative influence arises from the desire to be accepted and avoid social rejection. Often, both forces are present simultaneously.

    In ambiguous scenarios, informational influence dominates. People follow the crowd because they believe it reflects reality. In clear situations, normative influence becomes more important. Individuals may recognize the group is wrong, yet conform to avoid standing out.

    The interaction between these two pressures explains why social proof can be powerful even among confident, intelligent individuals.

    Social proof extends beyond factual decisions into emotional evaluation. Research has shown that people’s emotional reactions can be shaped by how others respond to the same stimulus. When individuals believe that others found something amusing, impressive, or disturbing, they are more likely to report similar reactions.

    This effect suggests that emotions are not purely internal experiences. They are, in part, socially calibrated. The group provides cues about how one is “supposed” to feel.

    In environments such as theaters, rallies, or online platforms, emotional synchronization can spread rapidly, amplifying collective reactions.

    Social proof can also suppress action. When people observe others remaining passive in a situation that might require intervention, they often interpret inaction as a signal that action is unnecessary. The absence of response becomes its own form of information.

    This dynamic is particularly dangerous in emergencies, where hesitation can reinforce itself. Each individual waits for confirmation from others, while others are doing the same.

    In digital environments, social proof is magnified. Metrics such as likes, shares, and follower counts act as visible indicators of collective approval. These signals shape perception of credibility, value, and importance, often before content is even evaluated.

    Because these cues are immediate and quantitative, they carry disproportionate weight. Popularity becomes mistaken for accuracy, and repetition becomes mistaken for truth.

    Social proof reveals how deeply human judgment is intertwined with collective behavior. While following others can be adaptive, it can also lead individuals away from evidence and personal judgment. The crowd does not merely influence decisions; it can redefine reality itself.

    Understanding social proof requires recognizing that independence of thought is not the default state. It is an effortful process that must be actively maintained, especially in environments designed to highlight consensus.

    Chapter 4

    The Mere Exposure Effect: Familiarity as a Source of Preference

    People often believe their likes and dislikes are the result of careful evaluation. In reality, preference can emerge without deliberate thought, reasoning, or even awareness. The mere exposure effect describes the tendency for repeated contact with a stimulus to increase positive feelings toward it. Simply encountering something again and again can make it feel safer, more pleasant, and more acceptable.

    This effect operates even when the stimulus carries no obvious benefit and even when individuals cannot consciously remember having seen it before. Familiarity itself becomes a signal the brain interprets as positive.

    At a basic level, the mind treats the familiar as less threatening. Throughout human history, unfamiliar objects, people, or environments often carried risk. Recognizing something known signaled survival. Over time, this association between familiarity and safety became embedded in emotional processing.

    When the brain repeatedly encounters a stimulus without negative consequences, it begins to lower defensive responses. This reduction in mental friction is experienced subjectively as comfort or liking. Importantly, this process does not require reflection or approval; it happens automatically.

    One of the most striking aspects of the mere exposure effect is that it does not depend on conscious recognition. In experimental settings, participants have been exposed to unfamiliar symbols, sounds, or images for very brief intervals. Later, when asked to rate these items, they consistently favored those they had encountered more often.

    Many participants were unable to identify which items they had previously seen. Yet their preferences reflected exposure history with remarkable accuracy. This demonstrated that affective judgments can form independently of explicit memory.

    The mere exposure effect works gradually. Each exposure produces a small shift in evaluation, often too subtle to notice. Over time, however, these small changes accumulate. What initially felt neutral may come to feel positive, and what felt slightly uncomfortable may begin to feel normal.

    This explains why first impressions are not fixed. Repeated contact can soften negative reactions and strengthen positive ones, even in the absence of meaningful interaction.

    Beyond laboratory settings, the effect has been observed in natural social contexts. In educational environments, individuals who were seen frequently, but not necessarily engaged with, were rated more favorably than those encountered less often. Visibility alone was enough to influence evaluation.

    These findings suggest that liking does not always arise from compatibility or shared values. Sometimes it emerges simply from repeated presence within the same environment.

    The mere exposure effect has boundaries. Excessive repetition can lead to boredom or irritation, particularly when exposure becomes intrusive or unavoidable. Additionally, if early exposures are paired with negative experiences, familiarity may amplify dislike rather than reduce it.

    Context also matters. When a stimulus is associated with threat or harm, repetition does not produce comfort. The effect depends on neutral or mildly positive conditions.

    People rarely recognize familiarity as the source of their liking. Instead, they construct explanations that feel more reasonable, such as shared traits, quality, or personal relevance. This misattribution reinforces the belief that preferences are deliberate and justified.

    Because the true cause remains hidden, the influence of exposure is difficult to challenge. Individuals defend their preferences without realizing how little evaluation was involved.

    The mere exposure effect shapes cultural norms, tastes, and social acceptance. Over time, repeated visibility can normalize ideas, behaviors, or groups that initially seemed strange. Conversely, lack of exposure can preserve suspicion or discomfort.

    This mechanism plays a quiet but powerful role in shaping what feels “natural” or “right,” often without explicit endorsement.

    The mere exposure effect reveals that preference is not always the product of choice or reasoning. Familiarity alone can tilt emotional evaluation, slowly and silently. What feels personally meaningful may, in fact, be the result of repeated contact.

    Recognizing this tendency challenges the assumption that liking equals judgment. Sometimes, it is simply recognition wearing the mask of preference.

    Chapter 5

    False Memory: When the Past Is Rewritten

    Memory is often treated as evidence. People rely on it to define identity, justify decisions, and determine responsibility. Yet psychological research has consistently shown that memory is not a precise record of past events. Instead, it is a flexible and reconstructive process, shaped by expectation, suggestion, and later information. False memory refers to the formation of recollections that feel real and detailed but do not correspond to actual experiences.

    These memories are not lies in the usual sense. Individuals who hold them are typically sincere and confident. The error lies not in intention, but in the way the mind rebuilds the past each time it is accessed.

    When an event occurs, the brain does not store it as a complete recording. Instead, fragments are encoded: sensations, emotions, and general meaning. During recall, these fragments are assembled into a coherent narrative. Gaps are filled using assumptions, prior knowledge, and social cues.

    Each act of remembering is therefore an act of reconstruction. Once reconstructed, the memory is stored again in its altered form. Over time, repeated recall can gradually shift the content of the memory further away from the original event.

    One line of experimental research demonstrated how easily suggestion can introduce entirely new memories. Participants were presented with descriptions of childhood events, most of which were true. Alongside these, researchers included one fabricated event. Participants were encouraged to reflect on all of them and describe what they remembered.

    Over multiple interviews, a significant number of participants began to report detailed recollections of the invented event. They described emotions, surroundings, and sequences of actions, despite the event never having occurred. Confidence in these memories often increased with repetition.

    This finding revealed that imagination, when combined with suggestion and social authority, can generate memories that feel authentic.

    Another powerful demonstration of false memory comes from experiments involving lists of related concepts. Participants were exposed to groups of words connected by a common theme. Later, during recall, they frequently reported remembering a central concept that had not actually been presented.

    The brain prioritized overall meaning rather than precise detail. Because the missing word fit the theme, it was inserted into memory as if it had been present. This shows that memory favors coherence over accuracy.

    Such errors are not random. They follow predictable patterns based on association and expectation.

    False memories often carry strong emotional content. Emotion enhances confidence, making memories feel vivid and certain. Ironically, this confidence can make false memories more persuasive than accurate but emotionally neutral ones.

    Because people associate confidence with truth, false memories can influence personal relationships, legal testimony, and self-identity. Once established, they are difficult to challenge without triggering defensiveness.

    Memory is also shaped socially. When memories are discussed within groups, details tend to converge. Over time, shared narratives emerge, and individual recollections adjust to fit them. Discrepancies fade, replaced by consensus.

    This process strengthens group cohesion but reduces historical accuracy. Collective memory becomes less about what happened and more about what is agreed upon.

    The implications of false memory extend beyond personal error. In legal contexts, eyewitness testimony can be influenced by questioning style and external information. In families, shared but inaccurate stories can redefine personal histories. In societies, collective false memories can shape national identity.

    The danger lies not in memory’s fallibility, but in the assumption that it is reliable.

    False memory exposes a fundamental vulnerability in human cognition. The past is not preserved; it is continually rewritten. What feels like recall is often reconstruction guided by belief, expectation, and context.

    Recognizing this does not render memory useless, but it demands humility. Certainty about the past is not proof of truth. It is, at best, evidence of a story the mind has learned to tell itself.

    Chapter 6

    The Authority Effect: Obedience and the Transfer of Responsibility

    Across societies, authority structures exist to maintain order, coordination, and efficiency. Titles, uniforms, credentials, and institutional roles signal legitimacy and competence. From an early age, individuals are taught to respect and follow those in positions of authority. While this system enables large-scale cooperation, it also carries a psychological cost. The authority effect describes the tendency for people to comply with directives from perceived authorities, even when those directives conflict with personal judgment or moral standards.

    This tendency does not require cruelty or ill intent. Ordinary individuals, under the right conditions, may perform actions they would otherwise reject, simply because responsibility feels transferred to someone else.

    Authority functions by narrowing perceived choice. When a directive comes from a legitimate source, the decision is reframed: the individual is no longer choosing whether an action is right or wrong, but whether to follow instructions correctly. Moral evaluation shifts upward in the hierarchy.

    This psychological shift reduces internal conflict. Obedience becomes a way to avoid the burden of independent judgment. The presence of authority provides clarity, even when the action itself is disturbing.

    Experimental research has shown that the environment in which authority operates greatly affects obedience. When commands are issued within respected institutions, compliance increases dramatically. Symbols of legitimacy, such as formal settings, professional language, and standardized procedures, strengthen the effect.

    Participants in controlled studies have been observed continuing harmful actions when reassured that responsibility lay elsewhere. Even visible signs of distress from others did not consistently interrupt obedience when authority remained firm.

    These findings revealed that obedience is not driven by aggression, but by role compliance and trust in institutional structure.

    Authority is especially powerful when embedded within professional hierarchies. In healthcare, military, and corporate settings, individuals are trained to defer to superiors. While this improves coordination, it can suppress ethical questioning.

    In experimental simulations involving medical professionals, participants were willing to follow unsafe instructions when they believed they came from a legitimate superior. Their training and experience did not protect them; in some cases, it intensified obedience by reinforcing role expectations.

    This demonstrates that expertise does not eliminate authority effects; it can amplify them.

    Obedience often unfolds in small steps. Initial requests may seem harmless, but they establish a pattern of compliance. As demands escalate, individuals feel pressure to remain consistent with prior obedience. Refusal becomes psychologically more difficult over time.

    This gradual escalation prevents abrupt moral alarm. Each step appears only slightly worse than the last, allowing individuals to move far beyond their original limits without noticing the shift.

    Authority can also create emotional distance. When individuals view themselves as instruments executing orders, empathy is reduced. The focus shifts from the impact of the action to the accuracy of execution.

    Language plays a role here. Technical terms and abstract phrasing reduce emotional salience, making harmful actions easier to perform. Authority structures often rely on such language to maintain efficiency.

    Not all individuals obey authority unconditionally. Resistance occurs more frequently when authority is fragmented, when peers model dissent, or when victims are made visible and personal. These factors reintroduce moral evaluation and shared responsibility.

    However, resistance requires psychological effort. The default tendency remains compliance, especially when authority appears unified and legitimate.

    The authority effect reveals that obedience is not a personal flaw, but a structural vulnerability. Systems that demand unquestioning compliance can turn ordinary people into participants in harmful outcomes without requiring malicious intent.

    Understanding this phenomenon shifts the focus from individual blame to institutional design. When authority discourages questioning and concentrates responsibility, moral failure becomes not an exception, but a predictable result.

    My thoughts

    Now that you have read about these weaknesses of the human mind, I like to think you have a better idea of just how weak and fragile we are mentally.

    If you believe these facts paint a miserable picture of humanity, you should know that this is just a drop in the ocean. There are many flaws within the human mind that I have not mentioned in this book, as these are sufficient to prove my point.

    The unfortunate thing is that most of those who call themselves "intellectuals" study history and the disasters that killed tens of millions of humans, such as World War II, the famine in China, and Stalin's massacres in the Soviet Union, and reach the conclusion that they occurred for economic or social reasons. However, the real reason, which these people refuse to delve into, is that our powerful leaders are like all of us: subject to human nature and no better than us in that regard. Their brains are full of flaws and biases. Power and the dysfunctional human brain are a dangerous mixture.

    Most societies currently operate on the "carrot and stick" system. This system might work with animals, but humans are more complex; in most cases, forbidding something only increases the human desire for it. The greater tragedy is that even this backward system is not applied correctly or fairly. How then can we expect to build peaceful, just, and prosperous societies? This is another reason that drives us to study human nature more and focus on understanding and reforming the human mind through effective methods.

    After thousands of years, humans have indeed achieved great progress in many fields. But if you study historical writings about social life in Ancient Rome, for example, you will notice that when it comes to us as humans, we are still mentally backward. We still live within the same class structures and patterns. We are like a hungry child in a room full of noisy toys, ignoring and forgetting his hunger and the screams of his stomach and body, distracted by toys and the shiny jewels hanging on the walls, until his body collapses and hunger kills him.

    We are busy exploring space, the depths of the oceans, and the Earth. Enormous sums of money are squandered on these futile studies. I believe it is more appropriate for us to focus more on studying the human being from a mental perspective, and finding ways to reform these dangerous flaws existing within our brains. I have a firm belief that this is the future of humanity: reforming ourselves mentally and trying to move as far as possible from our animalistic natures. We must stop mowing the grass in the garden. Our house is on fire.

    ACADEMIC REFERENCES

    This book discusses well-known psychological research for educational purposes. All experiments cited are publicly available academic studies. Interpretations are written by me.

    1. Commitment and Consistency

    Foot-in-the-Door

    Freedman, J. L., & Fraser, S. C. (1966).

    *Compliance without pressure: The foot-in-the-door technique.*

    Journal of Personality and Social Psychology, 4(2), 195–202.

    Cognitive Dissonance (Forced Compliance)

    Festinger, L., & Carlsmith, J. M. (1959).

    *Cognitive consequences of forced compliance.*

    Journal of Abnormal and Social Psychology, 58(2), 203–210.

    2. The Halo Effect

    Physical Attractiveness Bias

    Dion, K., Berscheid, E., & Walster, E. (1972).

    *What is beautiful is good.*

    Journal of Personality and Social Psychology, 24(3), 285–290.

    Trait Contamination in Evaluation

    Thorndike, E. L. (1920).

    *A constant error in psychological ratings.*

    Journal of Applied Psychology, 4(1), 25–29.

    3. Social Proof

    Conformity to Group Judgment

    Asch, S. E. (1951).

    *Effects of group pressure upon the modification and distortion of judgments.*

    In H. Guetzkow (Ed.), *Groups, Leadership and Men*. Carnegie Press.

    Social Cues in Evaluation

    Axsom, D., Yates, S., & Chaiken, S. (1987).

    *Audience response as a heuristic cue.*

    Journal of Personality and Social Psychology, 53(1), 30–40.

    4. Mere Exposure Effect

    Original Exposure Theory

    Zajonc, R. B. (1968).

    *Attitudinal effects of mere exposure.*

    Journal of Personality and Social Psychology, 9(2), 1–27.

    Natural Social Exposure

    Moreland, R. L., & Beach, S. R. (1992).

    *Exposure effects in the classroom.*

    Journal of Experimental Social Psychology, 28(3), 255–276.

    5. False Memory

    Implanted Childhood Memories

    Loftus, E. F., & Pickrell, J. E. (1995).

    *The formation of false memories.*

    Psychiatric Annals, 25(12), 720–725.

    Meaning-Based Memory Errors (DRM)

    Roediger, H. L., & McDermott, K. B. (1995).

    *Creating false memories.*

    Journal of Experimental Psychology: Learning, Memory, and Cognition, 21(4), 803–814.

    6. Authority Effect

    Obedience to Authority

    Milgram, S. (1963).

    *Behavioral study of obedience.*

    Journal of Abnormal and Social Psychology, 67(4), 371–378.

    Authority in Medical Hierarchies

    Hofling, C. K., Brotzman, E., Dalrymple, S., Graves, N., & Pierce, C. M. (1966).

    *An experimental study of nurse-physician relationships.*

    Journal of Nervous and Mental Disease, 143(2), 171–180.

