Emerging on the pads of the fingers while the fetus is still in the womb, the whorls, arches, and loops on the ends of our digits are said to be unique to each individual. They are durable as well: even when fingerprints have been damaged or worn away, they can usually regenerate. Here’s how.
Although scientists agree that fingerprints begin to develop around the 10th week and are complete by the end of the 4th month, no one is certain of the precise process that creates them.
One theory holds that a middle skin layer, called the basal layer, is scrunched between the inside layer (the dermis) and the outer skin layer (the epidermis). As the faster-growing basal layer strains against its neighbors, the pressure causes the skin to buckle, forming “folds of the epidermis into the dermis,” and resulting in the complex patterns we see on our fingers today. Nerves are said to play a part in this process as well, as they are hypothesized to be the origin of the “forces that pull in the epidermis.”
Interestingly, it is precisely because the pattern is “encoded at the interface between the dermis and epidermis,” that it becomes nearly permanent and “cannot be destroyed by superficial skin injuries.”
The ridges of fingerprints are particularly susceptible to wear. Bricklaying is often used as an example of a repeated activity that can wear down fingerprints, rendering them unsuitable for personal identification. Likewise, some people (think criminals) have purposely burned off their fingerprints, either with acid or fire.
Cancer patients treated with certain forms of chemotherapy can also temporarily lose their fingerprints. In a condition known as chemotherapy-induced acral erythema, the drug capecitabine causes painful swelling and peeling on the palms of the hands and soles of the feet, sloughing off the fingerprints along with the skin.
However, in most cases, because the pattern is ingrained in the deeper skin layers, once exposure to the abrasive, caustic, or hot conditions ceases, the fingerprints will grow back.
In some cases, damage to a fingertip extends deeply into the skin’s generating layer, resulting in permanent changes to the fingerprint. Experts note, however, that the scar produced – be it from a burn or a cut – can itself become permanently encoded into the fingerprint pattern.
In addition, the ridges on fingerprints can get thicker and shorter with age, such that the prints of many elderly people can be difficult to discern.
Born that Way
Three known genetic conditions can result in a person being born without fingerprints.
NFJS and DPR
Naegeli-Franceschetti-Jadassohn syndrome (NFJS) and Dermatopathia pigmentosa reticularis (DPR) each have a variety of troubling symptoms including hyperpigmentation, abnormal sweating, anomalies of the hair, teeth, and skin, and, notably, no fingerprints. Both diseases are believed to arise out of keratin-related gene mutations and cell self-destruction that occurs in the basal layer of skin.
Adermatoglyphia
Unlike NFJS and DPR, adermatoglyphia produces no symptoms other than the absence of fingerprints. The condition first came to prominence when affected people struggled to cross borders (such as into the U.S.), and today it is sometimes referred to as “immigration delay disease.”
The condition runs in families, and scientists have identified a mutation in a particular gene, SMARCAD1, as a potential culprit, since the protein it encodes “plays a critical role in the restoration of heterochromatin organization and propagation of epigenetic patterns . . . .”
Latent Fingerprints
The fingertip images recovered by investigators at crime scenes, after sprinkling powder or applying a chemical, are called latent fingerprints. Formed when sweat and oil on the skin transfer the fingertip pattern onto a surface, they are often used to identify the perpetrator of a crime.
However, contrary to what is often depicted in Hollywood, latent fingerprint evidence is not foolproof, and several factors can contribute to inaccurate identification. First, no two impressions, even from the same finger, are ever precisely alike. Second, latent prints collected at crime scenes are rarely pristine; they are frequently partial, smudged, or dirty. Third, at some point people are involved in the matching process, which necessarily leaves it open to human error.
As a result, fingerprint identification is not without its detractors. In a 2011 study, 169 latent print examiners were asked to identify 100 pairs of fingerprints drawn from a pool of 744; 0.1% of the identifications were false positives, meaning an individual was identified as the source of a print he hadn’t made.
Although this is a small percentage, consider that in 2013 the FBI received over 60 million tenprint submissions: at a 0.1% error rate, as many as 60,000 false positive matches could have been produced (although, presumably, having to match as many as ten prints, as opposed to two in the study, would increase accuracy).
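As a rough scale check on that estimate, the following sketch simply applies the study’s 0.1% false-positive rate uniformly to all 60 million submissions (a simplification, since per-submission accuracy will vary):

```python
# Back-of-the-envelope estimate: expected false positives if the
# 2011 study's 0.1% false-positive rate applied uniformly to the
# FBI's 2013 tenprint submission volume.

false_positive_rate = 0.001       # 0.1% from the 2011 study
submissions = 60_000_000          # FBI tenprint submissions in 2013

expected_false_positives = int(submissions * false_positive_rate)
print(expected_false_positives)   # 60000
```

The point of the exercise is that even a tiny per-identification error rate compounds into a large absolute number at national scale.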
Fingerprints are so durable they are often used to help identify a dead person. As one fingerprint expert noted: “If a hand is found in the water you will see that the epidermis starts to come away . . . like a glove. I cut the epidermis off and put my own hand inside that glove and try to fingerprint it like that.” In fact, in 2012, a human finger found in the belly of a fish was traced back (using its fingerprint) to an Idaho man who had lost it several months previously in a wakeboarding accident.
Gorillas, chimpanzees, and koalas also have fingerprints, a fact that has led some experts to suggest that fingerprints evolved to improve grasping and to make the regulation of pressure and movement more precise.