The Beautiful World of Science
A blog about life, the world of science, and all that's in between. Neuroscience, genetics, linguistics... There are many wonderful things in our world, and I want to share the ones I find interesting.
Sunday, February 10, 2013
If you happen to drop by: I have started a new blog with my own domain. New topic, new look, but if you're interested, you'll find it here: www.anxietyreallysucks.com
Wednesday, January 30, 2013
Introducing: Massive Attack
For many of you, Massive Attack is probably not a new name (if it is, you may recognize the instrumental portion of their song "Teardrop" as the theme song for House). I recently purchased their third studio album Mezzanine, and I am in love with their music. The British duo, consisting of Robert Del Naja and Daddy G, helped popularize the "trip hop" genre. Trip hop is a fairly downtempo style of electronic music that originated in Britain and has been around for about two decades now.
Massive Attack blasted onto the music scene with their debut album Blue Lines in 1991 and have been going strong ever since. Incidentally, both Blue Lines and Mezzanine were included in Rolling Stone's list of the 500 Greatest Albums of All Time.
I have many favorite Massive Attack songs, but the one that I will leave with you is called "Angel." It's a haunting, eerie tune that creeps into your soul and makes you shiver. Enjoy.
Monday, January 28, 2013
Six Cognitive Biases that are Probably Preventing you from Thinking Rationally (Part 2)
4) Hindsight Bias
The hindsight bias is closely related to the confirmation bias; both are examples of how overconfident we can be in our own thoughts and beliefs. The hindsight bias is the tendency to overestimate how predictable something was after it has occurred ("I knew it all along!"). For example, when you get an A on a midterm, you "knew" it would happen because you studied so hard. When you run a red light and nothing bad happens, you "knew" there was no one coming. This bias can be particularly troublesome to students who use practice exams to study for finals: instead of trying the problems, the student looks at the answer key. "I knew that was going to be the answer." Well, no, you probably didn't.
Cognitive distortions like the hindsight bias probably derive from cognitive dissonance: the discomfort we feel when we hold two or more conflicting ideas. "I thought I was going to get an A on that midterm, but I got a B." These two conflicting thoughts make you feel quite uncomfortable; so you rectify them. "Well, it wasn't a fair exam. I have an unfair professor." Hindsight bias is one of the many ways we can reduce the amount of cognitive dissonance we feel. Obviously, when I'm right I knew it all along. When I'm wrong, something happened that was completely out of my control.
5) The Gambler's Fallacy
The Gambler's fallacy is the false belief that a purely random process becomes more predictable over time. For example, if you toss a coin 100 times and get 100 heads, you may feel certain that the next toss must be tails. The human mind has a tough time accepting that future events are not influenced by past events. No matter how many times you flip a fair coin, the chance you will get tails never fluctuates from 50%. This fallacy is much more general than coin-tossing: people tend to think that past failures indicate a higher chance of future success in many areas of life. (As a caveat, I should add that there are many situations in which past events do influence future outcomes - when events are not independent, as in card games where cards are drawn without replacement.)
Amos Tversky and Daniel Kahneman, two leading figures in the psychology of decision making, believed that the Gambler's fallacy comes from another cognitive bias called the representativeness heuristic: the idea that past experiences are representative samples that can be used to predict future outcomes (you'll notice that all of these biases are starting to sound rather similar). When watching the tossing of a coin, we have a mental image of an equal number of heads and tails, so when we encounter a streak of heads, we will expect a streak of tails to follow.
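The independence of coin flips is easy to check empirically. The sketch below (a toy simulation, with a made-up streak length and seed) flips a fair coin many times and records the outcome immediately following every run of five heads; the fraction of tails still hovers around 50%, streak or no streak.

```python
import random

def next_flip_after_streak(streak_len, trials=100_000):
    """Estimate P(tails) on the flip that follows a run of `streak_len` heads.

    We flip a fair coin repeatedly; whenever the preceding run reaches
    `streak_len` heads in a row, we record the outcome of the next flip.
    """
    random.seed(42)  # fixed seed so the sketch is reproducible
    run, tails_after, observations = 0, 0, 0
    for _ in range(trials):
        flip = random.random() < 0.5  # True = heads
        if run >= streak_len:
            observations += 1
            if not flip:
                tails_after += 1
        run = run + 1 if flip else 0
    return tails_after / observations

# Despite the preceding streak of heads, the estimate stays near 0.5.
print(round(next_flip_after_streak(5), 2))
```

Changing `streak_len` to any other value leaves the estimate essentially unchanged, which is exactly what independence means.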
6) The Availability Heuristic
There are many heuristics that we use to make decisions, but I think the one plaguing most people is the availability heuristic, which occurs when we make decisions or judgments based on how easily information comes to mind. For example, if you're out buying a new car, and you remember your uncle telling you about how much he loves his new Honda, you may be persuaded to buy a Honda. When memories are available to us, they can be quite persuasive. Many students unknowingly use this tactic to answer multiple choice questions; read the question, and whatever comes to mind is probably the right answer. Professors can take advantage of this mental short-cut to trip up their students with misleading questions.
This heuristic can be used to make a lot of false assumptions, and can enhance the effects of an illusory correlation (a situation where two things seem related, but are not). For example, say you have two friends from Belgium. Both of them happen to be fans of the show Friends. If asked whether most Belgian people like the show Friends, you'll probably say yes. Hopefully you can see how irrational this logic is; two individuals are hardly a representative sample of a population of 11 million. We use the availability heuristic in many different areas of our lives to make decisions, both trivial and important.
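How misleading is a sample of two? The quick simulation below (the 30% "true rate" is an invented number, purely for illustration) estimates how often a two-person sample is unanimously made up of fans even when most of the population is not.

```python
import random

def prob_unanimous_sample(true_rate=0.3, sample_size=2, trials=100_000):
    """Estimate the chance that every person in a small random sample
    is a fan, given that only `true_rate` of the population is."""
    random.seed(0)  # fixed seed for reproducibility
    hits = sum(
        all(random.random() < true_rate for _ in range(sample_size))
        for _ in range(trials)
    )
    return hits / trials

# With a 30% true rate, roughly 9% of two-person samples (0.3 squared)
# would lead you to conclude "everyone is a fan."
print(round(prob_unanimous_sample(), 2))
```

Nearly one in eleven tiny samples would "confirm" a belief that is false for 70% of the population; availability hands those samples to us and we rarely question them.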
Saturday, January 26, 2013
Six Cognitive Biases that are Probably Preventing You from Thinking Rationally (Part 1)
We'd all like to think that we see the world in an objective, rational manner. Unfortunately, our minds are hardwired to see the world in an egocentric frame of reference, and thus information tends to get distorted, lost, or invented. Over the course of evolution, the human brain has evolved many mechanisms for extracting the most relevant information it can out of the surrounding environment without having to overload its senses. One broad category of such mechanisms is the cognitive bias. Cognitive biases have evolved to speed up judgment times and make decisions easier to come by. However, by decreasing judgment time, your brain often sacrifices objectivity and precision. Decision-making mechanisms that speed up reaction time (while sacrificing accuracy) are referred to as heuristics, and many have been identified in cognitive psychology. Cognition can also be swayed by goals or motivational states (we often perceive things as we'd like to see them). Below, I will list six cognitive biases that are probably affecting the way you see the world.
1) The Fundamental Attribution Error
This one has to do with attributions, the ways in which you describe and interpret the behaviour of others. The fundamental attribution error occurs when you underestimate situational influences on another person's behaviour and overestimate the effects of that person's personality. For example, if you see a woman yelling at her child in a supermarket, you are likely to infer that she is an angry person, maybe even a bad mother. Why should that be your first impression when it is equally likely she's just having a bad day?
There are a few explanations for how this bias arose, but the explanation I like best has to do with another cognitive bias: the just-world hypothesis. This bias presumes that the world is a just place (we'd all like to believe that, right?) and so, basically, people get what they deserve. This is the bias that surrounds ideas like karma. Attributing other people's failures to their dispositional downfalls rather than situational causes allows us to keep believing that other people reap what they sow.
2) Confirmation Bias
Humans have an overwhelming tendency to confirm their own beliefs. Hence, it shouldn't come as a surprise that we're hardwired to do just that. The confirmation bias is the tendency to collect information from our surroundings that supports ideas or beliefs that we have already formulated. If you think you're allergic to your roommate's new perfume, then you'll notice the few times you sneeze while your roommate is wearing her perfume - while you miss out on the hundreds of times you sneeze while petting your cat. It's hard to think objectively when your brain is constantly searching for information that validates your beliefs.
The confirmation bias can even work its way into the realm of science. A well known problem in academia is the so-called publication bias (otherwise known as the "file drawer effect") which is the tendency to publish positive results and ignore negative or inconclusive results.
3) The Misinformation Effect
You may be aware that witness testimonies are not terribly accurate. Furthermore, you may know that leading questions (questions that direct the witness to a particular answer) are restricted when eliciting testimony in court. Why is this? Well, it has to do with the misinformation effect: the tendency for information presented after an event to alter recall of that event. The memory can be altered by presenting an individual with convincing post-event information that shapes the way the memory is recalled.
Memory traces are normally stabilized through a process called consolidation. Consolidation is dependent on protein synthesis at the molecular level and the hippocampus at the structural level. When a memory is retrieved, it enters a brief labile state where it must be reconsolidated. During this process, it is possible that new information can disrupt the reconsolidation of the memory and thus alter it. While still theoretical at this point, reconsolidation provides a promising framework for how our memories can be so inaccurate and misleading at times.
Monday, January 21, 2013
Pharmacogenomics: Why Drugs Don't Work as Well as We'd Like
We live in a world of massive scientific breakthroughs. In such an age of discovery and innovation, it is frustrating to learn that things never work as well as they theoretically should - as it turns out, the world doesn't exist in a vacuum. Pharmaceuticals are no different. According to the Journal of the American Medical Association, there are over 100,000 deaths a year in the U.S. attributable to adverse drug reactions (ADRs). Is there some way we could predict (and eventually prevent) ADRs? Well, yes.
That's where the study of pharmacogenomics comes in: by studying the common polymorphisms (genetic variants) related to the particular metabolic effects of drugs, we can tailor individual pharmacotherapy approaches to better suit each patient. How is it that genetic variation can lead to different drug effects? There are two frameworks for understanding how genotype and phenotype interact: pharmacokinetics and pharmacodynamics. Pharmacokinetics refers to how the body absorbs, metabolizes, distributes, and excretes drug products. For example, variation in the gene that codes for an enzyme that activates a drug could make the enzyme more or less likely to bind the drug. This could change the rate at which the drug becomes therapeutic. The second framework, pharmacodynamics, refers to the drug targets themselves - ion channels, receptors, enzymes, and the immune system. For example, variability in an ion pump that expels toxic drug by-products from the cell can alter the rate at which an individual experiences damaging side-effects.
Are the medications you're taking harming you because of your genetic make-up?
For a real example, take the opiate analgesic (painkiller) codeine. Codeine is a pro-drug, meaning it must be activated by enzymes in the body before it takes effect. The conversion of codeine from its inactive form to its active form (morphine) is accomplished by the activity of the enzyme CYP2D6, a member of the cytochrome P450 superfamily responsible for drug metabolism and bio-activation. Genetic variation in the region of the genome encoding the CYP2D6 enzyme can greatly alter codeine's efficacy. An allele that causes decreased activity of the enzyme would prevent the drug from reaching its full analgesic effects. Conversely, if the enzyme were overactive and converted codeine to morphine too quickly, the patient could be at risk of morphine toxicity. If we were able to genotype each individual for polymorphisms in the CYP family of enzymes, we could predict which drugs would be most effective, and at which dosages.
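The genotype-to-response idea can be sketched as a simple lookup. The metabolizer categories below are real pharmacogenomics terminology, but the mapping itself is a toy illustration I've invented for this post, not clinical guidance.

```python
# Illustrative only: the phenotype categories are standard terminology,
# but the expected-response strings are a hypothetical sketch.
CYP2D6_PHENOTYPES = {
    "poor":         "little conversion to morphine -> likely ineffective analgesia",
    "intermediate": "reduced conversion -> possibly reduced analgesia",
    "normal":       "typical conversion -> standard response expected",
    "ultrarapid":   "excessive conversion -> risk of morphine toxicity",
}

def codeine_expectation(phenotype: str) -> str:
    """Map a CYP2D6 metabolizer phenotype to a (hypothetical)
    expected codeine response."""
    try:
        return CYP2D6_PHENOTYPES[phenotype]
    except KeyError:
        raise ValueError(f"unknown phenotype: {phenotype!r}")

print(codeine_expectation("ultrarapid"))
```

A real clinical implementation would of course draw on published genotype-to-phenotype tables rather than a four-entry dictionary, but the shape of the logic is the same: genotype in, dosing expectation out.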
Enter the evolving world of personalized medicine. With human genomics rapidly advancing, some day it may be quite possible to get a personalized profile of which drugs at which dosages you respond best to. It sounds a bit like science fiction, but really, we're not that far off.
Thursday, January 17, 2013
Mind and Body: Dual Entities, or a Redundant Distinction?
Many of you have probably heard of the mind-body problem: What is the relationship between mind and matter, between consciousness and the brain? Is the mind a distinct entity, not reducible to the basic firing of neurons? Or is consciousness merely a result of brain chemicals interacting with receptors?
The mind-body problem can be traced back to Descartes, who believed that the immaterial mind interacted with the material body through the pineal gland (a small endocrine gland located in the center of the brain that produces melatonin, the hormone involved in the sleep-wake cycle). Since Descartes, many philosophers have addressed the question. Their approaches can be broadly categorized into monism (the idea that mind and matter are two aspects of the same thing) and dualism (which assumes a rigid distinction between mind and body).
As a firm believer in the scientific method and the wonders of scientific inquiry, I find it hard to believe in an immaterial mind that exists in parallel to the body. It seems we need to turn to cognitive neuroscience to answer this age-old question. However, finding neural correlates for the subjective experiences we term "consciousness" is certainly no simple task. In fact, David Chalmers, an Australian philosopher specializing in the philosophy of mind, calls the task the "Hard Problem of Consciousness." The Hard Problem contrasts with the Easy Problem, which deals with how we obtain and integrate information (which is much easier, since we can point to specific brain regions that deal with these types of processes). How can we prove that the subjective experience of consciousness is caused by physical processes, such as simple neural processes? Only time (and innovative science) will tell.
Monday, January 14, 2013
Does Language Shape How We Think?
If you've ever taken a linguistics or cognitive psychology class, then you've probably heard of the Sapir-Whorf Hypothesis (otherwise known as linguistic relativity), which posits that the structure of language affects the way we see the world. You can perhaps think of hypothetical examples - the number of different colours a language contains may alter the way its speakers see colours in the physical world, and the ways a particular language refers to aspects of time may affect the way speakers experience the flow of time. Much of the research that went into this popular theory has been quite flawed, however. A glaring example is the idea that Eskimo languages have an unusually large number of words for "snow." This has become known as the "Great Eskimo Vocabulary Hoax," since it has since been demonstrated that Eskimo languages have no more words for snow than English does.
Noam Chomsky, the renowned linguist, cognitive scientist, and political thinker, disagrees with the notion that language shapes thought. Chomsky revolutionized the field of linguistics when he posited that all human languages had a core component that was hardwired at birth - a concept called Universal Grammar. Chomsky believes that language acquisition can be broken down into two components: principles and parameters. This view posits that grammatical principles underlying language are innate and hardwired from birth, while certain parameters must be learned after birth (such as whether the language calls for agreement between a verb and a subject, whether affixes can be added onto words, or whether the language is subject-object-verb or subject-verb-object). Chomsky believes that all humans are programmed for language, and that we do not simply learn language from our surroundings, but rather turn certain linguistic "switches" on and off depending on the language we're first exposed to.
Surely, language must shape the way we see the world in some ways, right? Do we not think in words? Doesn't that challenge the view that language does not shape thought? The jury seems to be out on this debate. What do you think?
Saturday, January 12, 2013
Introducing: Moloko
I will never claim that I have a particularly nuanced or adroit taste in music, but I do like sharing some lesser known musical talents that I have come across.
The UK-based electropop duo Moloko consists of Róisín Murphy from Ireland and Mark Brydon from England. The band's name comes from the slang used in Anthony Burgess's classic novel A Clockwork Orange, where it means "milk." The duo came to the musical scene in 1995 with their first studio album Do You Like My Tight Sweater?. A pleasant mix of electropop, trip-hop, and house music, Moloko's melodies are enchanting and oftentimes quite cheerful.
Here is one of my favorite Moloko songs, called "Familiar Feeling," from their fourth studio album Statues. Enjoy!
Friday, January 11, 2013
A Brief Introduction to Genetics
One of my biggest interests lies in psychiatric genetics, the study of how particular variants in the human genome contribute to mental illness. For today's post, I thought it would be helpful to give a quick-and-dirty review of genetics, for those of us who can't quite remember much from high school biology.
The blueprint of every organism is contained in long molecules of an organic chemical called deoxyribonucleic acid (DNA). DNA is a polymer of nucleotides - in other words, DNA is made up of a long chain of four different nitrogenous bases (adenine, thymine, guanine, and cytosine) in a particular order. DNA is found in the nucleus of every human cell, where it is organized into larger structures called chromosomes. Humans have 23 pairs of chromosomes (one set from each parent). The complete set of DNA in a human cell is called the human genome, and it is functionally organized into units called genes. A gene codes for a particular protein, although the relationship is a little more complicated than just that. The ultimate goal of the genome is to make proteins - proteins can control biological reactions, make and support structures, alter gene expression, and have many other jobs. A gene is located at a particular locus on the genome, and there can be multiple versions of that gene in the population (referred to as alleles).
Of particular interest in the field of genetics these days are genetic polymorphisms. A polymorphism is a variable region of DNA in which the rarest variant is too common to be maintained by new mutation alone. What this means is that polymorphisms arise through mutations in the genome, and are then maintained as they are passed along to offspring. The simplest kind of polymorphism is the single nucleotide polymorphism (SNP). A SNP is a single nucleotide in the long chain of DNA that is variable between different people. Polymorphisms can alter the structure of proteins, making them either more or less functional. They can also affect the expression level of the protein product.
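A SNP is easy to picture in code: compare two aligned sequences position by position and note where single bases differ. The sketch below uses tiny made-up sequences purely for illustration; real SNP calling works on aligned sequencing reads, not toy strings.

```python
def find_snps(seq_a: str, seq_b: str):
    """Return (position, base_a, base_b) tuples wherever two aligned
    DNA sequences of equal length differ by a single base - a toy
    model of SNP detection."""
    if len(seq_a) != len(seq_b):
        raise ValueError("sequences must be aligned and equal in length")
    return [
        (i, a, b)
        for i, (a, b) in enumerate(zip(seq_a, seq_b))
        if a != b
    ]

# Two individuals' (made-up) sequences differing at a single position:
print(find_snps("ATGCCTA", "ATGCTTA"))  # -> [(4, 'C', 'T')]
```

A single C-to-T difference like this can be enough to change an amino acid in the protein product, or to alter how strongly the gene is expressed.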
While polymorphisms tend not to be deterministic (meaning that they do not directly cause something to happen), they can modify or increase susceptibility to a particular disease. For example, a particular allele of the ApoE gene (which is involved in lipid transport and metabolism) can increase your likelihood of developing Alzheimer's disease.
One last thing I wanted to add is that genes do not necessarily determine the function of the organism. Just because you may have a family history of heart disease does not mean that you will die of heart disease. The environment is a large source of variation as well, and it can interact with genes through mechanisms known as epigenetics to alter gene products. In many cases, maintaining a healthy environment can reduce the risk of diseases you are genetically susceptible to.
Thursday, January 10, 2013
Let's Try This Again, Shall We?
It appears that I did not keep up with my blog as planned. Instead, I fell victim to the harsh realities of the "real world": school overtook my free time. Well, I think I'm going to take a second go at this blog. I'm taking some time off school for medical reasons, and I think this is a perfect opportunity for me to get this blog off the ground, so to speak.
So please, bear with me. Hopefully there will be some interesting content to come!