A study involving 15,400 students suggests that those who take their university exams in rooms with high ceilings perform worse than those who take them in standard rooms. Electroencephalograms revealed increased activity in the brain region associated with concentrating on a challenging task in these large rooms. These results offer a rarely explored insight into how interior architecture influences our cognitive performance.
Exam venues are typically chosen for their scalability and capacity to accommodate a large number of students simultaneously. The objective is both to simplify logistics and to bring candidates together under the same conditions. This makes it easy to verify identities at scale, thereby optimizing the cost-effectiveness of the supervising staff. These venues range from specially rented racetracks to large university amphitheaters and standard classrooms.
However, very few studies have explored how the interior layout and characteristics of exam rooms could influence student performance.
The few studies examining the question focused on specific parameters, such as familiarity with the environment. Nevertheless, one more recent study shows that the interior space of the exam room affects brain-activity markers associated with higher-order cognitive processes (those that go beyond memorizing information, such as reflection and analysis).
For its part, a team from the University of South Australia (UniSA) and Deakin University (also in Australia) directly evaluated the influence of exam room size on student results. “Given the association between [previously] identified brain signatures and their role in cognitive performance, we wanted to test the impact of a large room in a real-world context with a cognitive task that has been shown to induce stress (i.e., an undergraduate exam),” the researchers explained in their new study, recently published in the Journal of Environmental Psychology.
Graphic summary of the study, showing an overview of the experimental design.
Increased Brain Activity Related to Concentration on a Difficult Task
To conduct their investigation, the Australian researchers analyzed data from 15,400 undergraduate students across three university campuses between 2011 and 2019, comparing exam results with the ceiling heights of the rooms in which the exams were taken (ceiling height is generally proportional to room size). The analysis controlled for factors such as age, gender, the year the exams were taken, and students' possible prior experience with the courses being evaluated.
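The covariate-adjusted comparison described above can be sketched with synthetic data. Everything below (the effect sizes, the covariate ranges, and the simple least-squares fit) is an illustrative assumption, not the study's actual model or data:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1000

ceiling = rng.uniform(2.5, 9.0, n)   # ceiling height in metres (hypothetical range)
age = rng.uniform(18, 30, n)
experience = rng.integers(0, 2, n)   # prior exposure to the course (0/1)

# Assumed data-generating process: a small score penalty per extra metre.
score = 70 - 1.2 * ceiling + 0.1 * age + 2.0 * experience + rng.normal(0, 5, n)

# Least-squares fit with an intercept and the three covariates.
X = np.column_stack([np.ones(n), ceiling, age, experience])
coef, *_ = np.linalg.lstsq(X, score, rcond=None)
print(round(coef[1], 2))  # recovered ceiling effect, close to -1.2
```

Including the covariates is what lets the fitted ceiling coefficient isolate the room-size effect from age, cohort, and prior experience, which is the logic of the adjustment the researchers describe.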
Researchers discovered a correlation between poor results and exams taken in rooms with high ceilings. Experts suspect this is partly due to the fact that some of these spaces (such as sports gymnasiums, for example) are mostly reserved for sporting or festive events rather than activities requiring high concentration.
However, it was difficult to determine precisely whether these poor results were truly linked to room size. For instance, it could be due to other factors like the number of students in each room or the insulation quality. These aspects can lead to fluctuations in temperature and air quality, which, in turn, can affect the brain and the body in general.
“The key point is that large rooms with high ceilings seem to disadvantage students, and we needed to understand what brain mechanisms are at play and whether this affects all students to the same degree,” explains the study’s lead author, Isabella Bower, in a UniSA press release.
To confirm their hypothesis (the influence of room size), the researchers invited students to participate in virtual reality experiments simulating exams in a room while recording their electroencephalogram. It was found that simply sitting in a room with a high ceiling led to increased brain activity associated with concentrating on a difficult task.
This data suggests that adjustments are needed so that students taking exams are all on the same level and benefit from the same conditions to succeed. “These findings support the idea that the scale of the built environment influences cognitive performance and argues against holding exams in large, high-ceilinged rooms,” the researchers conclude.
For a few weeks, the permanent explosive activity of Stromboli, nicknamed the "Lighthouse of the Mediterranean," has been at a fairly high level. The situation changed last Wednesday, however, when a new effusive phase began from the north crater.
As with each episode of this type of activity, the steep slope causes the flow front to collapse, generating pyroclastic flows that run down to the coast. Although such activity is not rare at Stromboli, civil protection nevertheless triggered a red alert yesterday, July 4.
For a few weeks now, Stromboli’s usual explosive activity has been at a fairly high level, with a rather high explosive frequency, mainly at the north crater, one of the two active zones of the crater platform. When I was there at the end of May, for example, explosions occurred approximately every 5 to 6 minutes on average, ensuring a spectacular show!
One of the three active vents also displayed “spattering,” a splashing activity that revealed a lava level close to the surface. This sometimes overflowed, creating short ephemeral flows on May 24 and 27, and more recently on June 23 and 28.
A new effusive phase began on July 3. Like the previous ones, the lava overflow seems to have caused part of the cone formed by this intense explosive activity to collapse, resulting in an initial sequence of pyroclastic flows rushing down the Sciara del Fuoco, the steep slope on the northwest flank of the volcano that plunges directly into the Tyrrhenian Sea from the crater area.
https://youtu.be/Bwq9ndPivBs
A More Intense Effusive Phase
The next day, July 4, around 4:10 PM local time, a new effusive vent opened at about 700 meters altitude, just below the north crater. Particularly well-fed, this lava flow reached the coast around 5:30 PM, thus covering 1,000 meters of distance at an average speed of 12 meters per minute! As on the previous day, this also generated pyroclastic flows in the Sciara del Fuoco, with the event at 6:18 PM clearly standing out from the others as this cloud of hot gas and ash spread over the sea for several hundred meters! Moreover, the ash from these collapses formed a cloud that rose to over 2,000 meters in altitude.
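The quoted speed can be checked with a little arithmetic from the times and distance given in the article (both rounded in the original reporting):

```python
# Vent opened around 4:10 PM; the flow reached the sea around 5:30 PM,
# after covering roughly 1,000 metres.
start_h, start_m = 16, 10
end_h, end_m = 17, 30

elapsed_min = (end_h * 60 + end_m) - (start_h * 60 + start_m)
speed_m_per_min = 1000 / elapsed_min

print(elapsed_min, speed_m_per_min)  # 80 minutes, 12.5 m/min
```

That works out to about 12.5 m per minute, consistent with the "average speed of 12 meters per minute" cited, given the rounded times.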
While impressive, these various phenomena are not rare at Stromboli.
Because this activity is still more significant than usual, civil protection decided to raise the alert level to red, implying a strengthening of the volcano's monitoring system. For now, no measures directly affect tourist activity, which is getting into full swing on this wonderful little island in the Mediterranean Sea!
Today’s popular songs are significantly simpler in structure compared to those from 70 years ago. This could be due to the development of new genres like disco or hip-hop, as well as technical innovations that have influenced the melodies of popular songs.
From Elvis, the Rolling Stones, and the Beatles, through Pink Floyd, Roxette, and Michael Jackson, to Beyoncé and Taylor Swift, some music is more popular and successful than other works. However, browsing the charts of previous decades, one tends to encounter more musically exceptional songs than in today’s uniform chart successes. Popular songs and earworms also seem to have become noticeably faster over time.
Can we objectively prove this, or is it merely a subjective impression? Moreover, what has driven this development?
Comparing Chart Successes
To investigate this, Madeline Hamilton and Marcus Pearce from Queen Mary University of London studied the evolution of popular earworms over the past decades. They analyzed the main melodies of over 1,100 singles that reached the top 5 of the US Billboard charts at the end of each year between 1950 and 2022. They mathematically compared these melodies based on various musical characteristics, such as rhythm and tonal structure, diversity, number, and duration of notes.
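This kind of feature extraction can be illustrated with a toy example. The two metrics below, note density and pitch entropy, are simple stand-ins for "speed" and "complexity," not the authors' actual measures:

```python
import math
from collections import Counter

def melody_features(notes):
    """notes: list of (pitch, duration_in_seconds) pairs.
    Returns (notes per second, pitch entropy in bits): rough proxies
    for melodic speed and melodic complexity."""
    total_duration = sum(d for _, d in notes)
    density = len(notes) / total_duration
    counts = Counter(p for p, _ in notes)
    n = len(notes)
    # Shannon entropy of the pitch distribution, in bits.
    entropy = sum((c / n) * math.log2(n / c) for c in counts.values())
    return density, entropy

repetitive = [(60, 0.5)] * 8                                   # one note, repeated
varied = [(p, 0.5) for p in (60, 62, 64, 65, 67, 69, 71, 72)]  # a C-major scale

print(melody_features(repetitive))  # (2.0, 0.0)
print(melody_features(varied))      # (2.0, 3.0)
```

Both melodies play two notes per second, but the repeated note carries zero entropy while the scale reaches the maximum for eight notes: exactly the kind of distinction (same tempo, different complexity) the analysis of chart melodies relies on.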
Faster, but More Boring?
The analysis revealed that the complexity of rhythms and pitch arrangements decreased during this period.
Simultaneously, the average number of notes played per second increased, as did the number of repetitions, making the melodies faster and more monotonous over time.
Within this trend, two years stood out: 1975 and 2000, when melodic complexity dropped sharply. The year 1996 also marked a significant change in music history, though it was less drastic than in the other two years.
New Genres Changed Hit Melodies
Hamilton and Pearce suspect that the changes in 1975 could represent the rise of new preferences and music genres such as New Wave, Disco, and Stadium Rock. The “revolutions” in 1996 and 2000 could represent the rise of hip-hop and the introduction of digital audio workstations that allowed the repeated playback of audio loops.
Compared to previous studies, the researchers conclude that these musical “revolutions” took place several years later than previously assumed.
The researchers conclude that these innovations have led to simpler popular melodies. However, they emphasize that this does not necessarily mean music has become less complex overall or “worse.” The quality and combination of sounds could have developed independently of the melodies, or even in the opposite direction.
For example, simpler melodies could also be a consequence of modern songs becoming faster; artists may have avoided complex melodies to not overwhelm listeners. Artists may have focused more on sound quality or other features than melody, which only became possible with digital techniques.
The study also does not reflect the entire history of music. “Since the sample only contains the five most popular songs of each year, it cannot be said to represent US American or Western pop music in general. Therefore, we emphasize that a much larger dataset of melodies is needed to verify the conclusions of the study,” the researchers write.
For the first time, chemists have succeeded in creating an aromatic ring compound consisting only of metal atoms, without covalently linked support molecules. The team reports in "Nature Chemistry" that the ring consists of four positively charged bismuth ions connected to one another by delocalized bonding electrons. The charge field of two unconnected but neighboring molecules stabilizes this ring compound.
Aromatic compounds are ring-shaped molecules whose bonding electrons are not assigned to fixed positions between the atoms. Instead, they form delocalized orbitals that extend around the entire ring or large parts of it. This gives these compounds unique stability and properties. It's no wonder, therefore, that an estimated two-thirds of all organic molecules are wholly or partially aromatic.
The framework of such a ring usually consists of carbon atoms, as in benzene, but it can also be formed by other atoms, such as nitrogen or metals. In the latter case, however, the only aromatics known so far are those in which the ring-forming metal atoms are additionally stabilized by organic molecules covalently linked to them.
Four Metal Ions in the Aromatic Ring
But it’s possible without them, as chemists led by Ravi Yadav from the University of Heidelberg have now demonstrated. For the first time, they have succeeded in creating an aromatic ring consisting of positively charged metal ions without attached molecules. The ring consists of four bismuth cations (Bi+), whose total of 16 bonding electrons are delocalized. It is the first cationic, purely metallic aromatic compound without covalent “support molecules.”
However, this metallic aromatic ring is only possible because there are two larger organic molecules in its immediate vicinity. Although these are not connected to the metal ring, they still serve as stabilizers: “The highly electron-deficient ring is held in the symmetrical charge field formed by two electron-rich calix-pyrrolate units,” the chemists explain. Held by this charge field, the metal ring floats in a cavity between the two accompanying molecules.
Also Possible with Other Metals
Additional tests showed that other metals with similar outer electron configurations, such as lead and tellurium, can also form such cationic aromatic rings. In all cases studied, four-membered rings with delocalized sigma orbitals formed, as the team reports. "We expect that our approach can be applied as a general method for stabilizing other positively charged rings and cages," says senior author Lutz Greb from the University of Heidelberg.
At the same time, their experiments also provided new insights into the bonding behavior of metal atoms. “Aromatic compounds made of pure metal atoms initially serve fundamental understanding. However, some unexpected effects in our work point to a new basic concept in the field of aromaticity,” explains Greb. “It could be significant for charge transport in metals.”
Whether it’s disgust, illness, or a swaying ship deck, when we feel nauseous, our appetite disappears. Neurobiologists have now deciphered what happens in the brain during this process, uncovering a previously unrecognized mechanism. According to their findings, during nausea, specific cells in the amygdala fire and send appetite-suppressing signals to areas throughout the brain. Even intense hunger struggles to override their stop signals. However, these neurons do more than just cause loss of appetite, as reported by the team in “Cell Reports.”
There are various factors that can suppress our appetite: Intense stress, an infection, motion sickness, or the sight of something repulsive can make us feel nauseous. We then lose our desire to eat, even if we’re actually hungry. This break from eating gives the body time and resources to focus on immediate issues.
But what triggers the typical loss of appetite during nausea?
While the brain regions and circuits that regulate normal feelings of satiety and hunger are known, it was unclear whether they were also responsible for nausea-induced loss of appetite.
To address this question, Wenyu Ding from the Max Planck Institute for Biological Intelligence in Martinsried and her colleagues focused on a brain region: the amygdala. It is a crucial center for processing emotions, particularly fear.
However, the central part of this brain area also contains neurons that control satiety. But which of these neurons fire during nausea? And how far do these signals extend?
For their experiment, Ding and her team induced nausea in mice using a chemical and then observed, among other methods using fluorescent markers and electrodes, how different groups of neurons in the central amygdala responded. Additionally, they tested the mice’s behavior: did they eat less when nauseous, even though they had previously fasted? How did nausea affect their behavior?
We Have Discovered a Completely New Type of Neuron
They found something unexpected: In addition to the known satiety neurons in the amygdala, there is another type of brain cell in this area. These so-called Dlk1 neurons do not fire during satiety but during nausea. "We found that these neurons are activated by nausea-inducing agents, bitter tastes, and gastrointestinal disturbances," the research team reported. These neurons receive signals from many brain regions, including those processing disgust and unpleasant odors.
When these newly discovered brain cells fire, their signals override even strong feelings of hunger: when the Dlk1 neurons were artificially stimulated, hungry mice stopped eating and even drinking. Conversely, turning off the Dlk1 cells in the amygdala caused the mice to eat even when they felt nauseous. This indicates a dedicated circuit with specialized brain cells for nausea-induced appetite loss in the mouse brain, and presumably in the human brain as well, according to Ding and her team.
More Than Just A Loss of Appetite
But that’s not all. The team also found that these appetite-suppressing cells are unusually widely connected. While the known satiety neurons mainly target neighboring cells within the amygdala, the extensions of the Dlk1 cells send inhibitory signals to distant brain regions. As a result, this inhibitory effect extends even to the so-called parabrachial nucleus, a central interface between the cerebellum and brainstem.
This has consequences that go beyond mere loss of appetite and could also explain typical human behaviors during nausea: When the mice felt nauseous, they were less social than usual and sought less contact with their conspecifics. “However, these effects are not due to anxiety or altered movement,” Ding and her team explain.
Instead, the nausea-activated Dlk1 neurons are responsible for the signals.
The discovery of this specific appetite-blocking circuit provides valuable insights into the effects nausea has on our brains, as well as which neurons and circuits are involved. At the same time, it offers new insights into the complex regulation of our appetite and eating behavior.
The renowned composer Ludwig van Beethoven is hailed as a musical prodigy, but what does his genetic makeup reveal about him? A research team has now scrutinized this more closely. The surprising finding: at least as measured by a common genetic marker for musicality, Beethoven's genome appears not particularly outstanding. According to the team, the composer ranks only in the upper midfield on this polygenic index. However, this genetic marker captures only a small aspect of musicality.
Many eminent composers, including Mozart, Beethoven, and Bach, are considered prodigies with exceptional musical abilities. But what distinguishes them? Are certain genes responsible for the remarkable musical abilities of such individuals?
Twin studies suggest that musicality is about 42 percent heritable. A recent genome-wide association study also identified 69 gene variants that contribute to a strong sense of rhythm and bolster other aspects of musicality.
A Comparative Look at Beethoven’s Genome
At this point, Beethoven and his genetic material come into play. Researchers have already gained initial insights into Beethoven’s genome, particularly the genetic basis of his illnesses, thanks to the preservation of DNA in some locks of the composer’s hair. “We have now extended this approach to musicality,” said Laura Wesseldijk and her colleagues. For this purpose, they searched the composer’s genome for the 69 gene variants associated with musicality and sense of rhythm.
“We calculated this polygenic index for Beethoven and then compared it with two population-based datasets of thousands of modern individuals,” explains Wesseldijk and her team. The comparison groups consisted of approximately 5,600 individuals from a Swedish twin registry and 6,150 individuals from a US genome database, each of whom had been tested for rhythm and musicality.
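Schematically, such a polygenic index is a weighted sum over genotyped variants, and the percentile comes from ranking that sum within a comparison cohort. The variant names, effect weights, and cohort below are invented for illustration; the study's actual 69 variants and weights are not reproduced here:

```python
import bisect

def polygenic_index(genotype, effects):
    """Weighted allele sum: `genotype` maps variant -> allele count (0, 1, 2),
    `effects` maps variant -> per-allele effect weight."""
    return sum(effects[v] * genotype.get(v, 0) for v in effects)

def percentile(score, population_scores):
    """Share of the comparison population scoring below `score`, in percent."""
    pop = sorted(population_scores)
    return 100.0 * bisect.bisect_left(pop, score) / len(pop)

effects = {"rs0001": 0.10, "rs0002": 0.30, "rs0003": -0.05}  # hypothetical weights
subject = {"rs0001": 2, "rs0002": 1, "rs0003": 0}            # hypothetical genotype

score = polygenic_index(subject, effects)
cohort = [i / 20 for i in range(1, 21)]                      # toy comparison cohort
print(score, percentile(score, cohort))
```

The comparison the researchers describe follows the same pattern: compute the index for Beethoven, compute it for each person in the Swedish and US cohorts, and report where he falls in those distributions.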
“We deliberately did not make any predictions about where Beethoven’s polygenic index would lie because, with our study, we wanted to primarily highlight the limitations of this approach,” emphasize the researchers.
Genetically Good, But Not Outstanding
The genome comparisons revealed something surprising: Beethoven ranks in the ninth and eleventh percentiles of both populations with his polygenic musicality index. Thus, the composer has more music-specific gene variants than around 90 percent of the comparison individuals, but this position is not outstanding.
After all, almost ten percent of the comparison individuals have a higher index than he does, as determined by Wesseldijk and her team.
"At first glance, these results seem quite confusing," writes the research team, "because Beethoven, one of the most famous musicians in history, scores relatively unremarkably here." But why? "Obviously, it would be wrong to conclude solely from the results of the polygenic index that Beethoven's musical abilities were not particularly remarkable," explains the team. The musician's compositions prove the opposite.
But what is the reason for Beethoven’s moderate genetic outcome? As Wesseldijk and her colleagues explain, there are several reasons for this: Firstly, polygenic indices like the one examined here only capture a small fraction of genetic effects because they only capture common gene variants, not rare ones. “Secondly, PGIs are approximations at the population level, which can only make limited, accurate predictions for an individual,” the team says.
And finally, the most important point: "Musicality is not a single trait but a multi-component suite of abilities," the researchers explain. These abilities, in turn, rest on a mixture of genetic factors, some of which influence only specific aspects of musicality. Beethoven's exceptional musicality therefore most likely stems from genetic factors not captured by this particular index.
“A Valuable Lesson”
Accordingly, these results do not question that genetic makeup influences a person’s musical talent. However, they also illustrate the limits of genetic tests. “We believe that the significant disparity between this DNA-based prediction and Beethoven’s musical genius is a valuable lesson,” says co-author Simon Fisher of the Max Planck Institute for Psycholinguistics.
“It shows that one should be skeptical when, for example, someone claims that a genetic test can reliably determine whether a child will be musically talented or talented in another field,” the researcher continues.
An ant mill is a phenomenon in which a group of migrating ants loses its pheromone trail and becomes separated from the colony. Searching for the trail, the ants begin to follow one another, creating a trail of their own that closes into a circle. Most of the ants in such a circle eventually die of exhaustion. The phenomenon is rare in nature but has been replicated in laboratories and appears in simulations of ant colonies. Similar behavior has been observed in processionary caterpillars and fish.
Discovery of the ant mill
In 1910, the American myrmecologist William Morton Wheeler described a spontaneously formed ant circle he had observed in the laboratory, which lasted for 46 hours. The first documented description of an ant mill in the wild, however, dates to 1920 and is due to William Beebe. While exploring the Guyana jungle for the New York Zoological Society (later known as the Wildlife Conservation Society), Beebe encountered a group of migrating ants.
The next morning, seeing that the ants, which belonged to the genus Eciton, were still there, he followed their trail and discovered that it formed a closed circle, in places up to six rows wide. Astonished by this revelation, Beebe retraced the trail to confirm it. The circle measured 1,210 feet (370 meters) in circumference, and from the ants' average speed Beebe estimated that it took them about two and a half hours to complete the circuit.
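Beebe's lap-time estimate follows directly from the measured circumference and the ants' average walking speed; in metric units, his figures imply:

```python
circumference_m = 370       # Beebe's measured loop
lap_minutes = 2.5 * 60      # his estimated time per circuit

speed_m_per_min = circumference_m / lap_minutes
print(round(speed_m_per_min, 2))  # about 2.47 m per minute
```

So each ant was plodding along at roughly two and a half metres per minute, around the clock, over the same 370 m loop.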
American animal behavior scientist Theodore Christian Schneirla (1902–1968) explained the phenomenon in 1944. Drawing from prior research on ant behavior, Schneirla concluded that the circular movement resulted from the chemicals left behind by ants, later identified as pheromones, which caused the ants to follow each other. A circular motion occurs when the leading ants lose the trail of the column and then rediscover the followers’ trail, akin to a tracker following their own footprints.
Schneirla also noted that a similar circular motion had been described as early as 1896 by Jean-Henri Fabre, who observed it in pine processionary caterpillars.
The presumed explanation for the ant mill phenomenon is the pheromone trail that ants of certain species, engaged in group food gathering, mark on the soil surface during foraging expeditions. The goal is to facilitate a quick and efficient journey to the food source and back to the ant nest.
Using their antennae, located close to the ground, ants perceive the direction and intensity of the odor and strictly follow the pheromone trail.
At some point, due to various reasons, a malfunction occurs in the ant’s algorithm, causing it to start walking in a closed circle, leading its fellow ants into a deadly procession when they encounter its trail. A possible reason for the malfunction is that the food raid, in some cases, lasts too long, and by the time the ant returns home, the scent of the pheromone trail has dissipated.
As a result, halfway through the journey, the ant deviates from its course and turns several times, only to immediately encounter its own trail again. Newly deposited pheromones have the strongest scent, so the ant repeats the cycle along its own freshly created trajectory.
Small “whirls” occur in terrain of almost any type and are most common where two ant trails pass close to each other or intersect. However, a phenomenon similar to the one described at the beginning of the article occurs only in a large open area without particularly significant irregularities.
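The trail-following failure described above can be abstracted into a toy model: each trail cell points to the cell whose pheromone the ant follows next, and once the ant re-enters its own trail, it repeats the same loop indefinitely. This is a deliberate simplification (real ants respond to pheromone gradients, not pointers):

```python
def locked_cycle_length(next_cell, start):
    """Follow the trail from `start`, one cell per step. The first time a
    cell is revisited, the ant is locked into a loop; return its length."""
    step_seen = {}
    pos, step = start, 0
    while pos not in step_seen:
        step_seen[pos] = step
        pos = next_cell[pos]
        step += 1
    return step - step_seen[pos]

# A trail that wanders from cell 0 and rejoins itself at cell 2:
# 0 -> 1 -> 2 -> 3 -> 4 -> 5 -> 2 -> 3 -> 4 -> 5 -> 2 -> ...
trail = {0: 1, 1: 2, 2: 3, 3: 4, 4: 5, 5: 2}
print(locked_cycle_length(trail, 0))  # 4
```

The point of the abstraction is that nothing in the rule "follow the freshest trail" can break the loop once the path closes on itself; only an external disturbance, such as uneven terrain, can.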
Evolution
Kinship
The occurrence of ant mills is known only among migrating ants. In the 20th century, it was assumed that the species exhibiting this behavior did not share a common ancestor in which it could have originated. Three lineages are involved: Aenictinae and Dorylinae in the Old World and Ecitoninae in the New World, and the behavior was believed to have evolved independently in each of them. From 2003 onward, however, it became evident that the kinship between these lines traces back much further, up to 105 million years, so there could indeed have been a shared ancestor before the complete separation of the Gondwana supercontinent. As of 2023, all three groups are classified under Dorylinae.
In their natural habitat, ant mills are rare due to variations in terrain, where migrating ants occasionally lose their trail, breaking the circle. When such natural obstacles are absent, for example, in a laboratory or on a sidewalk, the formation of an ant mill is almost inevitable.
Because ant mills are infrequent in nature, there is not much evolutionary pressure to eliminate this phenomenon. However, it is noteworthy that an evidently harmful behavioral pattern could persist for over a hundred million years. Explanations lie in two behavioral factors and the reproductive strategy:
The nomadic lifestyle and food-catching habits of migrating ants create a strong evolutionary pressure toward unconditional collective behavior. An ant isolated from the migrating column is nearly doomed, and an individual can hardly secure food alone, as migrating ants primarily hunt prey much larger than themselves.
Additionally, there is the reproductive strategy: unlike many other ants, migrating ants have queens that do not fly, resulting in slow evolution. Due to this combination, there are few evolutionary opportunities and little pressure to eliminate the detrimental collective behavior.
Social insects, such as bees and termites, are insects that form colonies with a hierarchical structure reminiscent of human societies, with a queen and workers. These colonies are essentially family units, however, and differ significantly from human societies in substance. The opposite of social insects is solitary insects. Historically, whether an insect was considered social depended on the presence of hierarchical divisions within the group.
Contemporary assessments, however, emphasize the existence of infertile castes: insects with castes of non-reproducing individuals are termed eusocial ("truly social"). Research in this direction has led to the discovery of several insect groups exhibiting true social behavior beyond the classic groups mentioned above, which are themselves all considered eusocial today. Nevertheless, because these newly recognized groups differ considerably in nature, they are often treated separately. This article focuses on social insects in the classical sense.
Some insects, although not forming large colonies or exhibiting hierarchical divisions, are considered subsocial when parents and offspring cohabit. Additionally, the term “parasocial” is used when unrelated individuals form a group. These aspects are significant in considering the evolution of social insects.
Sociality
Most insects do not care for their hatched offspring. While some insects are known to engage in parental care, many abandon their offspring before they mature. In contrast, certain bees and ants not only care for their offspring but also continue to live together even after the offspring have grown, forming large colonies. Insects with such behavior are termed social insects.
In vertebrates, temporary family formations, where parents and offspring form families and engage in social relationships within a population, are common. The term “sociality” is used when a group, including multiple individuals or families, exhibits more complex structures and interactions than a mere gathering. Although challenging to precisely define, sociality generally refers to relationships resembling human societies.
Even compared to social structures in vertebrates, bees and ants forming large colonies appear remarkably similar to human societies. For instance, the presence of classes such as queens and worker bees, each with specific roles, resembles aspects of human societies.
This has led to comparisons between insect and human societies. However, upon closer examination, differences emerge, such as morphological variations based on caste differences, reproduction limited to queen individuals, and the majority of colony members being siblings. These aspects distinguish insect societies significantly from vertebrate societies.
The social lifestyle of social insects has proven highly successful: they constitute a significant portion of the animal biomass in terrestrial ecosystems. In the tropical rainforests of Brazil, for example, ants and termites make up some 80% of the insect biomass, with ant biomass alone reaching nearly four times that of all non-fish vertebrates. Despite comprising only about 2% of insect species worldwide, social insects are estimated to account for half of today's insect biomass.
As a result, these insects play a crucial role in natural ecosystems. Bees serve as pollinators for flowering plants, wasps act as predators of insects, and termites play a vital role in decomposing plant matter, especially in tropical regions. Ants, with their diverse diets and lifestyles, contribute to various ecological aspects, including predation on small animals, seed dispersal, symbiosis with other organisms, and soil improvement.
Superorganism
In social insects, members depend on each other for survival: an individual can hardly survive alone, and reproduction and the founding of new colonies occur only within the group. Some researchers therefore consider the entire colony equivalent to a single organism, terming it a superorganism. The concept was first proposed by the ant researcher William Morton Wheeler, who studied social insects extensively and referred to their colonies as superorganisms. Although criticized by some, such as Kinji Nishinow, who objected that it disregards the existence of male bees, the colony is still recognized as a unit corresponding to an individual.
Various Social Insects
Sociality in Bees
Within the order Hymenoptera, to which ants (family Formicidae) belong, there exists a range of social behaviors, from highly social to subsocial and solitary. Traditionally recognized social insects include the ants, potter wasps, hornets, and honeybees.
The societies of social bees and ants are run exclusively by females. Fertilized eggs develop into females (future queens or workers), while males develop from unfertilized eggs. After mating with a male, the queen constructs a nest alone; males die after mating and contribute nothing to nest-building. The queen then lays eggs continuously while caring for the hatched larvae.
Once the larvae mature into adult workers, they stay in the nest to assist the queen with tasks such as brood care, foraging, and nest construction, without reproducing themselves. In most species, new queens and males are born in autumn, leave the nest, and mate; the mated queens overwinter while the rest of the colony perishes. Consequently, many bee colonies last only one year (although honeybees and certain ants maintain nests for multiple years).
Ants are, with very few exceptions, fundamentally social insects. The trap-jaw ant, for instance, lives without a queen, but this is considered a secondary adaptation. Some ant species develop soldier ants with well-developed mandibles.
Sociality in Termites
All termites exhibit social behavior. When winged termites leave the nest and mate, the resulting male-female pairs construct a nest. These individuals become the king and queen, engaging in repeated mating and egg-laying.
The offspring initially resemble their parents and, upon reaching a certain stage of development, transform into worker termites, assisting the king and queen with nest construction and other tasks. Some of their offspring develop into soldier termites as they grow. Soldier termites do not engage in reproduction.
Of the remaining offspring, some develop into winged termites and leave the nest. Many termite colonies persist for many years.
Colony Management
A 10-foot-high termite mound in Botswana.
In many social insects, reproductive individuals (bee queens, termite kings, and queens) solely handle reproduction, while all other tasks are performed by workers (worker bees or worker ants). However, in species like hornets, where only reproductive individuals can survive the winter, reproductive members initially handle all tasks, from nest-building to foraging.
Once workers emerge, the reproductives remain in the nest and focus exclusively on reproduction. In termites, some colonies nest within the wood they feed on, while others, particularly in tropical regions, forage outside for food. In the latter case, individuals lay pheromone trails as markers to navigate between the nest and food sources. Honeybees perform the well-known figure-eight (waggle) dance to communicate the location of food sources to other colony members. Worker roles include carrying food, maintaining and managing the nest, and caring for larvae and reproductive individuals.
In species where a single individual (or pair) serves as the reproductive entity within a nest, there are instances where candidate reproductive individuals emerge from larvae within the nest if the reproductive members die. These are known as replacement reproductives.
When one of these becomes the new reproductive, the others are killed. Reproductive individuals release pheromones that suppress the emergence of other reproductives. Many of these insects engage in trophallaxis, the mouth-to-mouth sharing of food, which also facilitates pheromone transmission. Notably, some species, such as the polygynous Myrmecina nipponica, create large colonies with multiple reproductive individuals (multiple queens) and multiple nests, numbering in the tens of thousands to hundreds of thousands of individuals.
Evolution of Sociality
The sterile workers of social insects posed a difficulty for Charles Darwin himself: worker bees do not reproduce, and since traits cannot be passed to offspring without reproduction, their existence seemed to defy natural selection.
One proposed explanation was the "queen manipulation" theory: the queen, through pheromones, turns her offspring into workers. The trait of converting one's own offspring into workers would be selected through the queen, since it eases brood care and allows her to produce many more offspring. However, this theory could not rule out rebellion among the workers: if a spontaneous mutation arose that rejected the queen's control, workers could begin producing offspring of their own.
This dilemma was resolved by Hamilton's kin selection theory. The theory begins from the idea that in natural selection it is not the individual as such but the traits the individual expresses that are selected: an individual survives because it possesses certain traits, and the genes underlying those traits are what get selected. The perspective therefore shifts to the standpoint of the individual gene, taking kinship into account.
To explain with a human example: a child carries half of each parent's genes, so from the parent's perspective the probability that a given gene is also present in the child is 1/2. The same holds between siblings: the probability that a gene in one is present in the other is also 1/2. Thus a gene promoting care for one's daughter and a gene promoting care for one's sister have similar chances of being propagated.
In Hymenoptera such as ants and honeybees, fertilized eggs become females and unfertilized eggs become males (haplodiploidy). As a result, the probability that a gene in one full sister is also present in the other is 3/4: all sisters inherit the same paternal genome from their haploid father, plus each maternal allele with probability 1/2. In this case, a gene promoting care for one's sisters is more likely to be passed on than a gene promoting care for one's daughters.
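The relatedness arithmetic above can be checked explicitly. The key point is that a haplodiploid father has a single genome copy, so all of his daughters inherit identical paternal alleles. A minimal sketch using exact fractions; the function names are illustrative, not from the source, and the model idealizes parents as unrelated:

```python
from fractions import Fraction

def relatedness_diploid_full_siblings() -> Fraction:
    # Pick a random allele in sibling A: it is the maternal or the
    # paternal copy, each with probability 1/2. Either copy is present
    # in sibling B with probability 1/2, since each parent passes one
    # of its two alleles at random.
    maternal = Fraction(1, 2) * Fraction(1, 2)
    paternal = Fraction(1, 2) * Fraction(1, 2)
    return maternal + paternal

def relatedness_haplodiploid_sisters() -> Fraction:
    # The haploid father passes his single genome copy to every
    # daughter, so a paternal allele is always shared; a maternal
    # allele is shared with probability 1/2, as in diploids.
    maternal = Fraction(1, 2) * Fraction(1, 2)
    paternal = Fraction(1, 2) * 1
    return maternal + paternal

print(relatedness_diploid_full_siblings())  # prints 1/2
print(relatedness_haplodiploid_sisters())   # prints 3/4
```

With these coefficients, a hymenopteran female preserves more copies of her genes by raising sisters (r = 3/4) than daughters (r = 1/2), which is the basis of the argument for worker sterility.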
From this perspective, it becomes apparent that in a group with sufficiently close kinship, whether in Hymenoptera or in animals with a typical sex determination system, forgoing one's own offspring to support one's parents and raise additional siblings can serve the goal of preserving one's genes. If a gene induces an individual to help its parents rear young instead of reproducing itself, and this behavior preserves copies of that gene better than direct reproduction would, the gene will spread through natural selection.
In this way, the existence of sterile workers in social insects can be explained by the theory of natural selection. This led to the view that the defining characteristic of social insects is the presence of a sterile caste. Kin selection theory became a foundation for the development of sociobiology.
The New Meaning of Social Insects
In this way, the existence of a sterile caste was shown to be a significant characteristic of social insects, one that is unique even compared with the social structures found in mammals. Consequently, this type of sociality is referred to as "eusociality." Among the insects traditionally considered social, only some bees (and ants) and termites exhibit eusociality. E.O. Wilson defined eusociality by three criteria:
The presence of a sterile caste.
Multiple generations living together.
Cooperative care of young individuals.
With the basis of eusociality now apparent, it became conceivable that other eusocial insects might exist. Since kin selection requires forming groups and maintaining high relatedness within them, researchers looked for, and found, new instances of insects with sterile castes. In aphids, for example, soldier aphids were discovered.
Aphids establish large colonies after winged females settle on a plant and reproduce asexually. Some aphids born in these colonies have sharp mouthparts and scythe-like front legs; they cling to attackers and defend the colony, but never mature or reproduce themselves. Since colony members are clones produced asexually by the same mother, relatedness is even higher than in termites or bees, making the evolution of true eusociality all the more likely.
Subsequently, true eusociality was discovered in other insects, as well as in non-insect species such as certain shrimp and mammals like the naked mole rat. These organisms do not disperse, and high relatedness among individuals is maintained through inbreeding.
Eusociality is thus not exclusive to insects; such species are collectively referred to as "eusocial animals" or "eusocial organisms." In biology, when "sociality" is mentioned without qualification, it may refer specifically to eusociality; in this sense, humans are not considered eusocial animals. Currently, the naked mole rat is the only mammal known to exhibit true eusociality.
Ants, Bees, and Termite Nests
Some of these insects build massive nests that create new environments and harbor diverse ecosystems. In special cases they cultivate fungi or tend small animals such as aphids and scale insects; in addition, various small animals inhabit their nests uninvited, much as rodents or cockroaches live alongside humans, as freeloaders, food guests, or outright burglars. Around the nest, accumulated food and excrement can raise soil nitrogen levels, and ants eliminate insects in the vicinity, so certain plants are found selectively near nests. These phenomena are described by terms like "myrmecophily" and "termitophily."
Development of Sociality
Among bees there are various grades of life, from solitary living through family living to true eusociality, and comparing these habits provides insight into how sociality evolved. There are two lineages of social bees. In one, subsocial species in which parents and offspring coexist are thought to have given rise to true sociality; this route was proposed by Wheeler, and the research of Kojiro Iwata supported it.
Examples include predatory species such as paper wasps and hornets. Their ancestors were apparently solitary predatory wasps like beewolves and mud daubers, which paralyze insects or other prey to provision their larvae. While many such wasps seal the nest after laying eggs, some add further prey as the larva grows. From such practices, family life involving brood care evolved, leading to the advanced sociality seen in hornets.
Honeybees, which feed on pollen and nectar, have various solitary relatives in the bee family, such as carpenter bees and small carpenter bees. Many solitary bees stock their nests with pollen and nectar and seal them after laying eggs. From such solitary habits, bees evolved through small-scale family living, as in mason bees, to the large-scale advanced sociality seen in honeybees.
On the other hand, another route is also discussed: from communal nesting to cooperative nesting, and from such parasocial species to true eusociality. This was proposed by C. D. Michener, who suggested that reproductive females gather to found a nest together and that, by some means, all but one female then loses reproductive ability, yielding true eusociality. In reality, colonies of social bees and ants in temperate regions mostly begin with a single solitary female.
In tropical social bees, however, nest founding by multiple females is more common. In some paper wasps, within a group founded by multiple females there is a clear linear hierarchy, and only the top-ranked female lays eggs. This hierarchical system was expected to advance comparative sociology by allowing comparison with the pecking orders observed in birds and other animals, although subsequent progress seems to suggest otherwise.
Termites are all eusocial. Unlike bees, termites do not directly feed their offspring; termite larvae feed themselves. Their evolutionary history is therefore completely different from that of bees. How termites came to be social is unclear, but they possess symbiotic microorganisms capable of decomposing cellulose, which enables them to feed on wood. Newly hatched offspring must acquire these microorganisms through mouth-to-mouth transfer from their parents, which may have driven the development of family life. Furthermore, the high degree of inbreeding resulting from enclosed living inside wood may have increased relatedness, although this is debated.