Sunday, 6 December 2015

Coronary Heart Disease in the UK by Jess Clarke

My friend Jess Clarke and I have done an article swap. You can check out her blog here: http://jessclarkeblogger.blogspot.co.uk


Why is there a high rate of CHD in the UK and why would there be a pattern of different areas in the UK having different rates of CHD deaths?

Coronary heart disease (CHD) is the term that describes what happens when your heart's blood supply is blocked or interrupted by a build-up of fatty substances, called atheroma, in the coronary arteries. CHD is the most common cause of death in the UK, with around 80,000 deaths each year. Different areas of the UK have different rates of CHD; the general trend is that death rates from CHD are highest in Scotland and the North of England and lowest in the South of England.

There are many factors that increase the risk of CHD and may increase the rate of CHD in a particular country, such as age, prevalence of smoking and obesity. 
Age can greatly affect the chances of getting CHD. The risk of CHD increases with age, and as the UK has an ageing population, a higher proportion of the population is at risk of contracting CHD. In the UK 10 million people are over 65 years old, and CHD is a serious concern among older adults because of the increasing number of people living beyond 65, an age group with a high rate of CHD; this age structure could help explain why the UK's CHD rates are so high. Within the UK, certain communities and areas also have distinctive age profiles. Some areas are very family orientated, with many middle-aged working people and children, while others have a higher population of retired or older people. For example, there are five retirement villages in Cheshire, which could help explain why CHD mortality rates are slightly higher in the north of England, as older people are at greater risk of contracting CHD and dying from a coronary event.
Smoking is another factor that greatly increases the risk of CHD. World Health Organization (WHO) research estimates that over 20% of cardiovascular disease is due to smoking, and that the risk of mortality from CHD is 60% higher in smokers. The prevalence of smoking in the UK declined to its lowest level in 2013, with 18.7% of people over 18 being smokers. While this may seem like a low percentage, it is still 10 million adults who smoke, and the number of ex-smokers exceeds even that. Ex-smokers still have an increased risk of getting CHD for at least 15 years after stopping, so even though fewer people currently smoke, the many ex-smokers in the UK could again explain why the UK has a high rate of CHD. Smoking does not only increase the risk of CHD for the person actively smoking: regular exposure to passive smoking increases CHD risk by 25-30%. Many people are regularly exposed to passive smoking in public and so may have an increased risk, but people living with smokers, who experience passive smoking very frequently, have an increased risk of death from CHD of 50-60%. This helps explain why the UK has high CHD rates, as it is a western country with a relatively high number of smokers and ex-smokers and a large number of people exposed to passive smoking on a regular basis. It could also help explain why certain areas of the UK have higher CHD mortality rates: people living in the countryside are less likely to be exposed to smoke regularly, provided they are not, and are not living with, a smoker, whereas people in large, busy cities are more likely to be exposed to passive smoking frequently, so these urban areas may have higher rates of CHD and of CHD mortality.
Poor nutrition can also result in an increased risk of CHD. CHD risk is related to cholesterol levels, which can be raised by eating too much fat, in particular saturated fat, which contributes to the atheroma that blocks blood vessels and causes CHD. Obesity is a major result of poor nutrition that greatly increases the risk of CHD. In the UK adult obesity rates have almost quadrupled in the last 25 years, and nearly two-thirds of men and women are obese or overweight, according to analysis conducted by the Institute for Health Metrics and Evaluation. According to an American study, women with a BMI in the overweight category had a 50% increase in risk of non-fatal or fatal coronary heart disease, and men a 72% increase. This shows that being overweight greatly increases the risk of CHD, and the large number of obese people in the UK could be driving up the country's CHD rate.
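To make the BMI categories mentioned above concrete, here is a minimal Python sketch of how BMI is calculated and mapped onto the standard WHO adult bands. The function names are my own, purely for illustration.

```python
def bmi(weight_kg, height_m):
    """Body mass index: weight in kilograms divided by height in metres squared."""
    return weight_kg / height_m ** 2

def who_category(b):
    """Standard WHO adult BMI bands."""
    if b < 18.5:
        return "underweight"
    elif b < 25:
        return "normal"
    elif b < 30:
        return "overweight"
    else:
        return "obese"

# An 85 kg person who is 1.75 m tall has a BMI of about 27.8,
# putting them in the overweight band discussed in the study above.
print(round(bmi(85, 1.75), 1), who_category(bmi(85, 1.75)))
```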
Poor nutrition can also mean a lack of certain nutrients, such as the omega-3 fatty acids found in oily fish, which have been shown to reduce CHD mortality. For poorer people in the UK, living in poorer areas, fish rich in omega-3 fatty acids may be too expensive, particularly when there are much cheaper options (which are often less healthy and could contribute to the high obesity levels), and this could explain why certain areas have higher CHD mortality rates.
Exercise can greatly reduce the risk of CHD, yet there is a striking lack of exercise in the UK, which contributes to obesity and to high levels of CHD. A study examining physical activity across England found that nearly 80% of the population fails to hit key national government targets, which include performing moderate exercise for 30 minutes at least 12 times a month. According to Professor Carol Propper, “the levels of physical activity is shockingly low” in the UK. This lack of exercise could contribute to the high rates of CHD nationally. However, the amount of exercise differs around the UK: one of Britain’s “laziest” areas is Sandwell in the West Midlands, where only one in 20 people gets on a bike in any month, compared with areas such as Wokingham in Berkshire, where 82% of residents exercise at least once every month. This could explain why different areas of the UK have higher or lower CHD mortality rates.
Ethnicity can also affect the risk of CHD: studies have shown that south Asian people living in the UK have a higher premature death rate from CHD, 46% higher for men and 51% higher for women. Hypotheses for this include migration, disadvantaged socio-economic status and a ‘proatherogenic’ diet. Migration to the UK remains at record levels, with net migration of 250,000 people. A proportion of these people will be from South Asia, which could be adding to the UK's high CHD rates. In certain areas of the UK, different ethnicities often form communities; in Rugby, for example, there is a large Eastern European community. Different ethnic communities within the UK could therefore mean differing local rates of CHD if certain ethnicities are more or less at risk of dying from CHD.
Rates of CHD are currently high in the UK due to a number of different factors, and these factors may be more or less common in certain areas, for example within different socio-economic groups, resulting in a range of CHD mortality rates across the UK.

  

Tuesday, 24 November 2015

Should Human Cloning Be Legal? by Emily Lauterpacht


In recent years, as technology has advanced, so have our ambitions in how to use it. The first ever clone was created by Hans Adolf Eduard Driesch in 1885, by shaking a sea urchin embryo made up of two cells until they separated and eventually grew into two separate sea urchins. However, cloning as we know it today involves the transfer of the nucleus of one cell into another cell (usually an enucleated egg cell), and the first successful example of this was in 1952, by Robert Briggs and Thomas King. These two men cloned a frog, and found that many attempts at cloning failed; some of the clones that did survive grew abnormally. These early difficulties continue to occur regularly, and are amongst the many reasons that the cloning of humans has been so controversial.

Frogspawn
To those who are for human cloning, it represents countless ways that our lives can be improved. It is seen as a way in which want-to-be parents who can’t have children can not only have a child, but have that child be genetically related to them, and in many cases, only them. At the moment, the options for these potential parents are generally limited to adoption, use of a surrogate, in vitro fertilisation (IVF) or artificial insemination, depending on the situation, with the latter two possibly involving sperm and/or egg donors. This means that the child is usually either fully or partly not genetically related to one or both parents. For many want-to-be parents, these options are far from ideal. The use of human cloning in these situations would mean that couples with fertility problems, or single parents, could have a child that was genetically related to them, and only them, as all of the genetic material comes from one person, with the potential exception of mitochondrial DNA if the egg used is not that of the person being cloned.
As mentioned earlier, in cloning animals there have been many issues with growth abnormalities. This has meant that there are very poor survival rates for clones; Dolly the sheep was the only lamb born alive out of 277 eggs. Although the statistics have improved since the time of Dolly’s cloning in 1996, still only 2-3 animals survive out of every 100 attempts. However, what worries many people more is that the animals that do survive to be born have unusually large numbers of deformities. For example, many cloned lambs have breathing problems, as well as enlarged blood vessels (often up to 20 times larger than normal), which put an immense amount of stress on the heart. Creating humans who are almost certain to have abnormalities similar to those experienced by cloned animals has huge ethical implications, and this is considered the main reason not to legalise human cloning.

Dolly the sheep
Many of those who are for the legalisation of human cloning say that the abnormalities mentioned above can be resolved with practice, and that through learning to successfully clone humans, we will also discover a lot about the ageing process, cell development and cancer. Through developing these areas, we can therefore further our health care abilities. It is also believed that through cloning humans, we could potentially use embryonic stem cells from a clone to restore sight, amongst other applications, or we could make brainless clones, which would act as organ donors if the person cloned were ever to need a transplant of any kind. Were these possible, they would bring yet more ethical issues, which would need to be weighed up against the benefits, which would include no issues of organ rejection, and not having to rely on brain-dead or dead organ donors.
Another issue with human cloning is lack of public understanding. Many members of the public, especially those who feel that human cloning would directly benefit them (for example, it may give them the possibility of having a child who is genetically related to them), strongly believe that human cloning should be legal. However, often they do not know all of the facts. For example, while they may be aware of the risk of deformities, they may not realise the potential extent of these, were they able to have a child made by cloning. They therefore blindly battle on, fighting for human cloning, without really knowing what they could be getting themselves into. Another reason that some people wish to clone humans is so that they may have a dead loved one cloned. Again, this may seem like a lovely idea, but it is predicted that were this to occur, it would have all sorts of detrimental psychological effects, both on the clone and on the people whose loved one has been cloned. The clone may look like the deceased, but they would have a different personality and different life experiences. Having these feelings and expectations projected onto them as they grow up would likely have negative mental effects on the clone. It would also not bring back the dead, and this is a concept that much of the public, especially when grieving, would likely struggle with. There would also be legal issues with granting permission for the clone to be produced, as clearly, the deceased are unable to consent.

Currently, human embryos can be cloned if you have a licence, as long as the embryo is destroyed within 14 days, which is when the nervous system begins to form. The strict laws that regulate this, in my opinion, strike the correct balance in this ongoing battle between science and ethics. They mean that research into, and techniques for, cloning are allowed to develop, while not risking the development of foetuses with abnormalities. I strongly believe that it would be wrong to allow the implantation of cloned embryos while there is still such a high risk of serious disabilities, both mental and physical. However, once cloning techniques have been perfected in animals and at these early stages, I believe I will reconsider my stance on the laws on cloning, as these unnecessary and life-altering deformities are the part of human cloning that I currently contest.

Emily Lauterpacht

Friday, 13 November 2015

Could fire exist without life? - follow up by Emily Lauterpacht

A few weeks ago I wrote about whether fire could exist without life. In the article (read it here), I concluded that fire would not exist if life didn't, but some recent reading has led me to realise that I was wrong in saying that "fire ... would not be possible, as ... any form of fuel on earth is either alive, has once been alive, or has been modified by humans from something that once was alive". 
Comet Lovejoy has been discovered to be emitting ethyl alcohol, amongst other organic molecules. In fact, at its peak activity, Comet Lovejoy was releasing as much alcohol as in at least 500 bottles of wine every second! Some of these organic molecules are flammable, and so, in theory, we have a fuel for fire that exists without life. However, I still stand by my statement that fire couldn't exist without life, as without life there is still not a sufficient source of oxygen to allow a fire to start.

Comet Lovejoy


Emily Lauterpacht

Tuesday, 3 November 2015

Faecal Microbiota Transplants by Emily Lauterpacht

Faecal microbiota transplants: a disgusting yet intriguing idea, which has taken off in the last few years. 


This seemingly unhygienic practice came into being after a landmark study about the gut bacteria of mice was published in 2006. It showed that the gut bacteria of fat and thin mice differ greatly. It also appeared that this was similar in humans: when gut bacteria from a thin human were given to a fat mouse, the mouse lost weight, even on the same diet, and if a thin mouse was given fat human gut bacteria, the mouse got fat.
Soon after this study, this area of research exploded, and soon scientists had found links between the microbes found in human guts and obesity, colon cancer, rheumatoid arthritis, allergies and diabetes. Connections have also been found in mice between their gut bacteria and both depression and multiple sclerosis. 
These discoveries have therefore given rise to faecal microbiota transplants (FMT). The main use for them currently (since 2013) is as a treatment for patients who have been infected with antibiotic-resistant Clostridium difficile (C. difficile). This has led to the survival rate increasing from less than a third to 94%. FMT is now also being used experimentally to treat other conditions (mainly gastrointestinal diseases and neurological conditions), such as IBS, colitis, chronic diarrhoea, Chronic Fatigue Syndrome and Parkinson's, and at least 10,000 people in the West have now had FMTs.
This rapidly climbing number has led doctors at the Massachusetts Institute of Technology to open the first and largest "stool bank" in the world, OpenBiome. Here, donors' faeces are screened, anonymised, given code names (for example Professor Dumpledore and Vladimir Pootin), and then usually shipped to hospitals or other institutes to be studied.
To find out more about FMT and OpenBiome, see their website here

Emily Lauterpacht

Monday, 26 October 2015

Could fire exist without life? by Emily Lauterpacht

We are taught fire needs three things (the combustion triangle): heat, oxygen and fuel. Watching a candle got me thinking about whether fire could exist without life. 


A heat source may come from many places, the most likely natural one being lightning, but direct sunlight may also start a fire. Neither of these require the existence of life, so this would not be a problem if it did not exist. 

Oxygen existed only in very small quantities until organisms began photosynthesising (possibly as early as 3.5 billion years ago). It is believed tiny amounts of oxygen were produced through water decomposing in the very upper atmosphere, as the hydrogen would escape into space, leaving the oxygen. However, it is unlikely this would even make up 1% of the atmosphere, let alone the 12% needed in the air to start a fire. On a side note, it is believed that without life, the atmosphere would be mostly made up of carbon dioxide and water vapour (percentages would depend on the Earth's temperature). 

Were enough oxygen somehow able to accumulate without life, fire still would not be possible, as, as far as I'm aware, any form of fuel on earth is either alive, has once been alive, or has been modified by humans from something that once was alive. 

Emily Lauterpacht

Sunday, 18 October 2015

A Lizard and a Guinea Pig by Emily Lauterpacht


From a young age we are taught that, unlike us, reptiles, amphibians and fish are "cold-blooded". A recent discussion in class left me wondering about this difference.


Humans, along with other mammals and birds, are endotherms, which means that they keep their body at a metabolically favourable temperature through internal bodily functions.
Reptiles, amphibians and fish are ectotherms, and rely on environmental heat sources to regulate their body temperatures.

The question arose: if an ectotherm (a lizard was our example) and an endotherm (a guinea pig) were put in a freezer, which would die first? As we didn't want to discover the answer experimentally, I decided to use my prior knowledge and some research to come up with an answer.

Certain ectotherms have a special adaptation which allows them to enter a state of torpor. This is when an animal decreases its physiological activity, usually by reducing its metabolic rate and body temperature. Torpor can last for anything from a few hours to a few years. If the ectotherm placed in the freezer was able to enter a state of torpor, there is little doubt that it would outlive its "warm-blooded" counterpart.

If the ectotherm was unable to enter a state of torpor, I believe it is still likely that it would survive longer than the endotherm. When we, as humans, get cold, we start shaking to keep warm, which uses a lot of energy. If we can't move somewhere warmer soon, we will eventually run out of energy and die. The same would happen in other endotherms, like the guinea pig. The guinea pig's body temperature will probably also be higher than the lizard's, and so further from the temperature inside the freezer. This will cause the endotherm to lose heat more quickly than the ectotherm.
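The idea that a bigger temperature difference means faster heat loss can be sketched with a toy Newton's-law-of-cooling model. This is purely illustrative (the cooling constant and freezer temperature are made-up numbers, and real animals generate and conserve heat), but it shows the guinea pig-like warm body shedding heat faster at first:

```python
def cool(body_temp, env_temp=-18.0, k=0.1, steps=5):
    """Toy cooling model: each step, temperature drops in proportion
    to the gap between body and freezer temperature."""
    temps = [body_temp]
    for _ in range(steps):
        body_temp += -k * (body_temp - env_temp)
        temps.append(round(body_temp, 1))
    return temps

print(cool(38.0))  # warmer, guinea-pig-like starting temperature
print(cool(25.0))  # cooler, lizard-like starting temperature
```

In the first step the 38 °C body loses 5.6 °C while the 25 °C body loses only 4.3 °C, matching the argument above.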

Therefore, in answer to the question, a guinea pig would probably die in a freezer before a lizard. 





Emily Lauterpacht

Wednesday, 7 October 2015

Nobel Prizes by Emily Lauterpacht


In the last few days, this year's Nobel Prizes for Chemistry and Physiology or Medicine have both been announced. As they both have biological relevance, I decided to find out more about this year's winners and their discoveries.


Nobel Prize for Chemistry

This year's Nobel Prize for Chemistry is equally shared between three scientists: Tomas Lindahl, Paul Modrich and Aziz Sancar. They have been awarded it "for mechanistic studies of DNA repair". Essentially, this means they have mapped, at a molecular level, and explained how cells repair their DNA, protecting this genetic information. The three scientists worked independently, and their individual findings and discoveries are documented below.

Tomas Lindahl: Whilst heating RNA, Lindahl saw that the molecule rapidly degraded and began to wonder whether, if RNA was so sensitive, DNA could really be stable for a lifetime. He estimated that thousands of defects must be created in DNA every day, but reasoned that if this were so and the damage went unchecked, humans could not exist as they do. He therefore concluded that something must be repairing the DNA to allow humans to survive.
As learnt at GCSE, the nitrogenous base cytosine normally pairs with guanine through complementary base pairing in DNA. However, cytosine can easily lose an amino group, and when this occurs, the base tends to pair with adenine. Lindahl realised that something must be protecting against this, and in 1974 he was able to identify an enzyme in bacteria that removed damaged cytosine from DNA. The process, now known as base excision repair, also occurs in humans, and in 1996 Lindahl recreated the human repair process in vitro.
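The complementary base pairing mentioned above can be made concrete with a tiny Python sketch. The function names are my own, for illustration only; the deamination step is a crude stand-in for the chemical change that base excision repair corrects:

```python
# Watson-Crick pairing: A pairs with T, and C pairs with G.
PAIR = {"A": "T", "T": "A", "C": "G", "G": "C"}

def complement(strand):
    """Return the complementary strand for a DNA sequence."""
    return "".join(PAIR[base] for base in strand)

def deaminate(strand):
    """Crudely model deamination: cytosine (C) loses its amino group,
    becoming uracil (U), which pairs like thymine instead of guanine."""
    return strand.replace("C", "U")

print(complement("ATCG"))   # the undamaged strand pairs as expected
print(deaminate("ATCG"))    # after damage, the C is no longer a C
```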


http://www.nobelprize.org/nobel_prizes/chemistry/laureates/2015/popular-chemistryprize2015.pdf
Aziz Sancar: It has long been known that DNA can be damaged by UV radiation. Aziz Sancar became particularly interested in bacteria which, when exposed to deadly doses of UV radiation, can recover if illuminated with visible blue light. In 1976, Sancar cloned the gene for the enzyme photolyase, which repairs UV-damaged DNA. However, it became clear that bacteria have a second system for repairing UV damage to their DNA that functions in the dark, which Sancar's colleagues at Yale had been studying since the mid-1960s. He was successful in identifying, isolating and characterising the enzymes coded for in three UV-sensitive strains of bacteria that carried three different genetic mutations (known as uvrA, uvrB and uvrC). Sancar then went on to carry out in vitro experiments, showing that these enzymes were able to identify and remove DNA damage caused by UV. This process became known as nucleotide excision repair, and Sancar proceeded to investigate it in humans, finding that although it was more complex, it was chemically similar.


http://www.nobelprize.org/nobel_prizes/chemistry/laureates/2015/popular-chemistryprize2015.pdf

Paul Modrich: During Modrich's postdoc, he began to examine DNA ligase, DNA polymerase and Eco RI, enzymes that affect DNA, but soon came across Dam methylase, which attaches methyl groups to adenine in DNA. He showed that these methyl groups functioned as signposts so that a particular restriction enzyme cut the DNA in the correct location. Working with Matthew Meselson, Modrich proved that these methyl signposts were marking that a DNA strand was not defective, and so did not need repairing. This became known as DNA mismatch repair. From here, Modrich systematically cloned and mapped many of the enzymes that work in the mismatch repair process, and towards the end of the 80s he was able to recreate the mechanism in vitro. We still don't know how human mismatch repair works, as methylation has a different function in our genome, but thanks to Paul Modrich's work, we are one step closer.


http://www.nobelprize.org/nobel_prizes/chemistry/laureates/2015/popular-chemistryprize2015.pdf


Nobel Prize for Physiology or Medicine

The Nobel Prize for Physiology or Medicine also went to three scientists this year. Half of the prize went to Youyou Tu, "for her discoveries concerning a novel therapy against Malaria", and the other half was equally split between William Campbell and Satoshi Omura, "for their discoveries concerning a novel therapy against infections caused by roundworm parasites". Their research has led to drugs that have treated diseases affecting more than 3.4 billion people around the world.

William Campbell and Satoshi Omura: In 1974, Omura isolated strains of a group of soil bacteria called Streptomyces that were previously known to have antimicrobial properties. He then sent the organisms to Campbell, who isolated avermectins, a class of compounds that kill parasitic roundworms, from the bacterial cultures. After slightly modifying one of the compounds isolated, the drug Ivermectin was developed and then released onto the market in 1981. By killing roundworms, the drug has radically lowered the incidence of River Blindness and Lymphatic Filariasis (Elephantiasis).


From left to right: William Campbell,  Satoshi Omura, Youyou Tu

Youyou Tu: In 1967, China set up a national project with the aim of discovering new therapies against malaria. Tu and her team studied over 2,000 traditional herbal remedies from China, in the hope of finding one with antimalarial properties. They discovered that an extract from the wormwood plant Artemisia annua was particularly effective. In 1972, a chemically pure compound known as artemisinin was isolated, which was developed into a drug. Artemisinin significantly reduces the mortality rates of patients suffering from malaria, and it is partly down to the work of Tu that malaria rates are down 75%.

Emily Lauterpacht