Diego V. Bohorquez, PhD
- Assistant Professor in Medicine
- Assistant Professor in Pathology
- Assistant Research Professor in Neurobiology
- Faculty Network Member of the Duke Institute for Brain Sciences
However, on a longer time span of 3600 years, pulses of increased landslide activity in the Swiss Alps have been linked to phases of deforestation (Dapples et al.). Fire, whether natural or man-induced, can be a major cause of slope instability and debris flow generation by removing or reducing protective vegetation, by increasing peak stream flows, and by leading to larger soil moisture contents and soil-water pore pressures (because of reduced interception of rainfall and decreased moisture loss by transpiration) (Wondzell and King, 2003). However, as with so many environmental changes of that nature in the past, there are considerable difficulties in being certain about causation. This has been well expressed by Ballantyne (1991: 84): Although there is growing evidence for Late Holocene erosion in upland Britain, the causes of this remain elusive. A few studies have presented evidence linking erosion to vegetation degradation and destruction due to human influence, but the validity of climatic deterioration as a cause of erosion remains unsubstantiated. Indeed, in many reported instances, it is impossible to refute the possibility that the timing of erosional events or episodes may be linked to high magnitude storms of random occurrence, and bears little relation to either of the causal hypotheses outlined above. The complexity and diversity of causes of stream channel change is brought out in the context of Australia (Table 6. Major changes in the configuration of channels can be achieved accidentally (Table 6.
First, it eliminates some overbank floods on the outside of curves, against which the swiftest current is thrown and where the water surface rises highest. Second, and more importantly, the resultant shortened course increases both the gradient and the flow velocity, and the floodwaters erode and deepen the channel, thereby increasing its flood capacity. It was for this reason that a programme of channel cut-offs was initiated along the Mississippi in the early 1930s. By 1950 the length of the river between Memphis, Tennessee and Baton Rouge, Louisiana (600 km down the valley) had been reduced by no less than 270 km as a result of 16 cut-offs. Some landscapes have become dominated by artificial channels, normally once again because of the need for flood alleviation and drainage. The causes of observed cases of riverbed degradation are varied and complex and result from a variety of natural and human changes. A useful distinction can be drawn between degradation that proceeds downstream and that which proceeds upstream, but in both cases the complexity of causes is evident. Deliberate channel straightening causes various types of sequential channel adjustment both within and downstream from straightened reaches, and the types of adjustment vary according to such influences as stream gradient and sediment characteristics. Brookes (1987) recognized five types of change within the straightened reaches (types W1 to W5) and two types of change downstream (types D1 and D2). Type W1 is degradation of the channel bed, which results from the fact that straightening increases the slope by providing a shorter channel path. Type W2 is the development of an armoured layer on the channel bed by the more efficient removal of fine material (Figure 6.). Type W3 is the development of a sinuous thalweg in streams which are not only straightened but which are also widened beyond the width of the natural channel.
Type W4 is the recovery of sinuosity as a result of bank erosion in channels with high slope gradients. Type W5 is the development of a sinuous course by deposition in streams with a high sediment load and a relatively low valley gradient. Types D1 and D2 result from deposition downstream as the stream tries to even out its gradient, the deposition occurring as a general raising of the bed level, or as a series of accentuated point bar deposits. It is now widely recognized that the urbanization of a river basin results in an increase in the peak flood flows in a river (Bledsoe and Watson, 2001). It is also recognized that the morphology of stream channels is related to their discharge characteristics and especially to the discharge at which bank full flow occurs. As a result of urbanization, the frequency of discharges which fill the channel will increase, with the effect that the beds and banks of channels in erodible materials will be eroded so as to enlarge the channel (Trimble, 1997). This in turn will lead to bank caving, possible undermining of structures and increases in turbidity (Hollis and Luckett, 1976). Other changes include decreases in channel sinuosity and increases in bed material size (Grable and Harden, 2006). Trimble (2003) provides a good historical analysis of how the San Diego Creek in Orange County, California, has responded to flow and sediment yield changes related to the spread of both agriculture and urbanization. Changes in channel morphology also result from discharge diminution and sediment load changes produced by flood-control works and diversions for irrigation (Brandt, 2000). The tendency of both rivers has been to form one narrow, well-defined channel in place of the previously wide, braided channels, and, in addition, the new channel is generally somewhat more sinuous than the old (Figure 6. 
Similarly, the building of dams can lead to channel aggradation upstream from the reservoir and channel deepening downstream because of the changes brought about in sediment loads (Figure 6. Such changes in channel form result from discharge diminution (c) caused by flood-control works and diversions for irrigation (modified from Schumm, 1977, figure 5. However, over time, the rate of degradation seems to become less or to cease altogether, and Leopold et al. suggested two reasons for this. First, because degradation results in a flattening of the channel slope in the vicinity of the dam, the slope may become so flat that the necessary force to transport the available materials is no longer provided by the flow. Second, the reduction of flood peaks by the dam reduces the competence of the transporting stream to carry some of the material on its bed. Thus if the bed contains a mixture of particle sizes the river may be able to transport the finer sizes but not the larger, and the gradual winnowing of the fine particles will leave an armour of coarser material that prevents further degradation. The overall effect of the creation of a reservoir by the construction of a dam is to lead to a reduction in downstream channel capacity (see Petts, 1979, for a review). In recent years, partly because some old dams have become unsafe or redundant, they have been demolished. Among the consequences of dam removal are incision into the sedimentary fill that had accumulated in the reservoir behind the dam, migration of a knickpoint upstream, deposition of liberated sediment and the formation of bars and the like downstream (Major et al. However, the precise consequences vary greatly depending on the nature and erodibility of the fill and the regime of the river. Equally far-reaching changes in channel form are produced by land-use changes and the introduction of soil conservation measures.
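The winnowing mechanism described above can be caricatured in a few lines of code. The grain sizes and the competence threshold below are illustrative assumptions, not values from the source:

```python
# Toy illustration of bed armouring below a dam: reduced flood peaks lower the
# maximum grain size the stream can transport, so fines are winnowed away and
# a lag of coarser particles (the armour) remains, preventing degradation.
# All numbers are illustrative assumptions, not measured values.

def armour_layer(bed_grain_sizes_mm, max_transportable_mm):
    """Return the grain sizes left on the bed once the flow has removed
    every particle it is still competent to move."""
    return [d for d in bed_grain_sizes_mm if d > max_transportable_mm]

bed = [0.5, 1, 2, 8, 16, 32, 64]   # mixed bed material (mm)
pre_dam = armour_layer(bed, 64)    # floods move everything: bed keeps degrading
post_dam = armour_layer(bed, 4)    # reduced peaks: only fines (<= 4 mm) move

print(pre_dam)   # [] - no armour forms
print(post_dam)  # [8, 16, 32, 64] - coarse lag halts further degradation
```

The sketch only captures the selective-transport logic of the passage; real competence depends on shear stress and grain protrusion, not a single size cut-off.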
The phase of intense erosive land use persisted and was particularly strong during the nineteenth century and the first decades of the twentieth century, but thereafter (Figure 6. Streams ceased to carry such a heavy sediment load, they became much less turbid, and incision took place into the flood-plain sediments. In south-west Wisconsin a broadly comparable picture of channel change has been documented by Knox (1977). There, as in the Upper Mississippi Valley (Knox, 1987), it is possible to identify stages of channel modification associated with various stages of land use, culminating in decreased overbank sedimentation as a result of better land management in the last half century. Other significant changes produced in channels include those prompted by accelerated sedimentation associated with changes in the vegetation communities growing along channels. It is a mistake to assume that because this is generally less dense in deserts than elsewhere, it is of no consequence. In fact, phreatophytes such as tamarisk, cottonwood and willow, which draw their sustenance from groundwater at depth, can significantly influence channel geometry by increasing bank resistance to erosion, inducing deposition and increasing roughness, and by taking up so much water that discharge is reduced. Such vegetation can lead to significantly reduced channel width and, as a result, to increased overbank discharge (flooding) (Birken and Cooper, 2006). Hadley (1961) demonstrated a similar effect on an arroyo in northern Arizona, and Graf (1981) demonstrated that in the ephemeral Gila River channel, sinuosity was increased from 1. Tamarisk, because of its root system, is capable of resisting the hydraulic stresses of flash-floods. It is also drought tolerant and has an ability to produce denser stands than native species, which gives it a competitive advantage.
There is, however, a major question about the ways in which different vegetation types affect channel form (Trimble, 2004). On the one hand tree roots stabilize banks, and their removal might be expected to cause channels to become wider and shallower (Brooks and Brierly, 1997). On the other hand, forests produce log-jams which can cause aggradation or concentrate flow onto channel banks, thereby leading to their erosion. Another organic factor which can modify channel form is the activity of grazing animals. These can break the banks down directly by trampling and can reduce bank resistance by removing protective vegetation and loosening soil (Trimble and Mendel, 1995) (Trimble, Man-induced soil erosion on the southern Piedmont, Soil Conservation Society of America). Finally, the addition of sediments to stream channels by mining activity can cause channel aggradation. Equally, the mining of aggregates from river beds themselves can lead to channel deepening (Bravard and Petts, 1996; Rinaldi et al.).

Reactivation and stabilization of sand dunes

To George Perkins Marsh the reactivation and stabilization of sand dunes, especially coastal dunes, was a theme of great importance in his analysis of human transformation of nature. His analysis showed quite clearly that most of the coastal dunes of Europe and North America had been rendered mobile, and hence a threat to agriculture and settlement, through human action, especially because of grazing and clearing. In Britain the cropping of dune warrens by rabbits was a severe problem, and a most significant event in their long history was the myxomatosis outbreak of the 1950s, which severely reduced the rabbit population and led to dramatic changes in stability and vegetative cover. Appreciation of the problem of dune reactivation on mid-latitude shorelines, and attempts to overcome it, go back a long way (Kittredge, 1948).
For example, the menace of shifting sand following denudation is recognized in a decree of 1539 in Denmark which imposed a fine on those who destroyed certain species of sand plants on the coast of Jutland. The fixation of coastal sand dunes by planting vegetation was initiated in Japan in the seventeenth century, while attempts at the re-afforestation of the spectacular Landes dunes in south-west France began as early as 1717 and came to fruition in the nineteenth century through the plans of the great Bremontier: 81,000 hectares of moving sand had been fixed in the Landes by 1865. In Britain possibly the most impressive example of sand control is provided by the re-afforestation of the Culbin Sands in north-east Scotland with conifer plantations (Edlin, 1976). Human-induced dune instability is not, however, a problem that is restricted to mid-latitude coasts. In inland areas of Europe, clearing, fire and grazing have affected some of the late Pleistocene dune fields that were created on the arid steppe margins of the great ice sheets, and in eastern England the dunes of the Breckland presented problems on many occasions. However, it is possibly on the margins of the great subtropical and tropical deserts that some of the strongest fears are being expressed about sand-dune reactivation. The increasing population levels of both humans and their domestic animals, brought about by improvement in health and by the provision of boreholes, have led to excessive pressure on the limited vegetation resources. The problem is not so much that dunes in the desert cores are relentlessly marching on to moister areas, but that the fossil dunes, laid down during the more arid phase peaking around 18,000 years ago, have been reactivated in situ. In practice most solutions to the problem of dune instability and sand blowing have involved the establishment of a vegetation cover.
Species used to control sand dunes must be able to endure undermining of their roots, burning, abrasion and often severe deficiencies of soil moisture. Thus the species selected need to have the ability to recover after partial growth in the seedling stages, to promote rapid litter development and to add nitrogen to the soil through root nodules. During the early stages of growth they may need to be protected by fences, sand traps and surface mulches. In the hearts of deserts sand dunes are naturally mobile because of the sparse vegetation cover. Even here, however, humans sometimes attempt to stabilize sand surfaces to protect settlements, pipelines, industrial plant and agricultural land. The use of relatively porous barriers to prevent or divert sand movement has proved comparatively successful, and palm fronds or chicken wire have made adequate stabilizers. Elsewhere surfaces have been strengthened by the application of high-gravity oil or by salt-saturated water, which promotes the development of wind-resistant surface crusts. Frequently these techniques are not particularly successful and very often the best solution is to site and design engineering structures to allow free movement of sand across them. Various studies of the comparative effectiveness of different stabilization techniques have been undertaken in recent years. This finding was confirmed by a study in the Kerqin Sandy land of northern China (Li et al. Along a major highway in the Taklamakan desert, checkerboards, reed fences and nylon nets were found to be effective (Dong et al. In temperate areas coastal dunes have been effectively stabilized by the use of various trees and other plants (Ranwell and Boar, 1986). In Japan Pinus thunbergii has been successful, while in the great Culbin Sands plantations of Scotland P. Of the smaller shrubs, Hippophae has proved highly efficient, sometimes too efficient, at spreading.
Its clearance from areas where it is not welcome is difficult precisely because of some of the properties that make it such an efficient sand stabilizer: vigorous suckering growth and the rapid regrowth of cut stems (Boorman, 1977). Different types of grass have also been employed, especially in the early stages of stabilization. These include two grasses which are moderately tolerant of salt: Elymus farctus (sand twitch) and Leymus arenarius (lyme grass). Another grass which is much used, not least because of its rapid and favourable response to burial by drifting sand, is Ammophila arenaria (marram). This raised the dune height approximately 4 m over a period of 6 years (after Savage and Woodhouse in Goldsmith, 1978, figure 36, with permission of the American Society of Civil Engineers).
The portal tracts are infiltrated with inflammatory cells (lymphocytes, macrophages, plasma cells). The infiltrate is limited to portal tracts and does not spill out into the hepatic parenchyma. Pathologically, it is characterised by piecemeal necrosis and fibrosis extending from the portal tracts into the hepatic parenchyma, leading to cirrhosis. Chronic lobular hepatitis: it refers to lobular inflammation with spotty necrosis. It is also useful in lamivudine resistant cases and can safely be given in the presence of decompensated liver disease. Hepatitis A vaccine: an inactivated protein vaccine (Havrix) grown in human diploid cells.

Clinical Features

It occurs most often in women (10-30 years and late middle age). The common symptoms are fever, fatigue, intermittent jaundice, weight loss, and pruritus.

Liver Abscess

Liver is the organ commonly involved in the development of abscesses. In developing countries, abscesses are due to parasitic infection (amoebic, echinococcal, other protozoal or helminthic organisms). Organisms reach the liver via the portal vein (amoebiasis, appendicitis, actinomycosis of the right iliac fossa) or via the arterial supply (septicaemia, pyaemia, faciocervical actinomycosis, infected hydatid cyst). Complications include sterile pleural effusions, contiguous spread from the liver, frank rupture into the pleural space and hepatobronchial fistula (good prognosis), and rupture into the peritoneum or pericardium (grave prognosis). Prevention: water sanitation; take fruits and vegetables after washing and after removing the skin.
These drugs are given along with a luminal agent (paromomycin 500 mg tid for 10 days or diloxanide furoate 500 mg tid for 10 days). In abscesses following a pelvic/intraperitoneal source of infection, anaerobes or mixed flora are common. It may end up with fibrosis and end-stage liver disease in the absence of significant alcohol consumption (Figs 5. Hepatic venous outflow obstruction: veno-occlusive disease, Budd-Chiari syndrome, constrictive pericarditis (Figs 5. The degeneration causes fibrosis followed by regeneration, resulting in the formation of nodules. Signs of liver cell failure, parotid and lacrimal gland enlargement and clubbing of fingers occur.

Importance of Platelet count

Progressive decline in platelet count is an important marker and the first clue to the evolution of cirrhosis in a patient with chronic liver disease.

Alcoholic Cirrhosis

It is characterised by diffuse fine scarring, fairly uniform loss of liver cells and small regenerative nodules. Prolonged serum prothrombin time occurs due to reduced synthesis of vitamin K dependent clotting factors. Serum albumin is depressed and serum globulins are increased due to impaired protein synthesis by the liver. Leucopenia and thrombocytopenia occur due to hypersplenism and the effect of alcohol on the bone marrow. Other abnormalities include hypomagnesaemia, hypophosphataemia, hyponatraemia, hypokalaemia and respiratory alkalosis. Ultrasonography is used to assess liver size and obstructive disorders of the hepatobiliary tree. Drugs must be administered with caution as almost all drugs undergo metabolism through the liver. Clinical features, diagnosis and treatment are broadly similar to those of alcoholic cirrhosis. Primary biliary cirrhosis is characterised by chronic inflammation and fibrous obliteration of intrahepatic bile ducts. Secondary biliary cirrhosis is characterised by partial or complete obstruction of the larger extrahepatic bile ducts. The autoantigen commonly involved is the 74-kDa E2 component of the pyruvate dehydrogenase complex.
Other symptoms include jaundice, fatigue, melanosis, steatorrhoea, malabsorption of fat soluble vitamins, and elevation of serum lipids resulting in xanthelasma and xanthomas. Fever and right upper quadrant pain (cholangitis/biliary colic) may occur in secondary biliary cirrhosis. A two- to five-fold increase in serum alkaline phosphatase and elevation of serum 5'-nucleotidase are seen. There may be an increased titre (more than 1:40) of antimitochondrial antibody.

Postnecrotic Cirrhosis

It is characterised by extensive loss of liver cells, stromal collapse and fibrosis resulting in broad bands of connective tissue containing the remains of portal triads and irregular nodules of regenerating hepatocytes. Hepatitis B and C viral infections (especially among homosexuals and intravenous drug abusers). Alcoholic and primary biliary cirrhosis lead to postnecrotic cirrhosis in later stages.

Cryptogenic Cirrhosis

The diagnosis of cryptogenic cirrhosis is reserved for those patients in whom no aetiology can be demonstrated.

Liver biopsy stages:
Stage 1: necrotising inflammatory process (acute and chronic inflammatory cells) of the portal triads with destruction of medium and small sized bile ducts
Stage 2: ductule proliferation
Stages 3-4: expansion of periportal fibrosis due to scarring

Methotrexate in a low dose and cyclosporine are used to slow the progression or arrest the disease. Ursodiol 13 to 15 mg/kg/day is shown to produce symptomatic improvement and improvement in serum biochemical parameters. Symptomatic treatment includes antipruritic agents and cholestyramine 8 to 12 gm/day for pruritus and hypercholesterolaemia. Secondary biliary cirrhosis is treated by surgical or endoscopic relief of the obstruction.
Complications of cirrhosis include portal hypertension, ascites, hepatic encephalopathy, spontaneous bacterial peritonitis, hepatorenal syndrome, hepatocellular carcinoma, coagulopathy, hepato-pulmonary syndrome, malnutrition, and bone disorders (osteopenia, osteoporosis, osteomalacia).

Cardiac Cirrhosis

Aetiology: prolonged severe right sided congestive heart failure may lead to chronic liver injury and cardiac cirrhosis.

Pathogenesis: in chronic right heart failure, retrograde transmission of elevated venous pressure leads to congestion of the liver. Portal hypertension is present when the sustained elevation of portal pressure is > 10 mm of Hg, but the risk of variceal bleeding is greater only when it is > 30 cm saline or > 12 mm of Hg.

Clinical Features: patients may present with any of the complications of portal hypertension. Portal pressure can be measured through catheterisation or through transjugular cannulation of the hepatic veins. Wedged hepatic venous pressure is high in sinusoidal and postsinusoidal portal hypertension. Portal venogram: the site and the cause of portal venous obstruction can be detected, and it is also performed prior to surgical therapy. Maintenance of portal hypertension after the collaterals are formed is attributed to a resultant increase in splanchnic blood flow. Beta-blockers like propranolol or nadolol lower portal pressure through their constrictive effects on the splanchnic arterial bed in combination with reduced cardiac output. Propranolol prevents recurrent bleeding from severe portosystemic gastropathy in cirrhotic patients. Treatment of alcoholic hepatitis, chronic active hepatitis and other diseases results in a fall in portal venous pressure and a reduction in variceal size. Oesophageal and gastric varices (the left gastric vein and short gastric vein join with intercostal, diaphragmatic, oesophageal and azygos veins of the caval system).
Haemorrhoids (the superior haemorrhoidal vein of the portal system joins the middle and inferior haemorrhoidal veins of the caval system). Caput medusae (remnants of the umbilical circulation of the foetus present in the falciform ligament may form a large paraumbilical vein). Other sites of anastomoses are retroperitoneal veins, lumbar veins, omental veins and veins over the bare area of the liver.

Variceal Bleeding

Variceal bleeding occurs when portal venous pressure is more than 12 mm Hg. Replacement of blood and coagulation factors with fresh frozen plasma (in coagulopathy). Terlipressin: terlipressin can be used as a better alternative to vasopressin in the control of acute variceal bleeding. Somatostatin: it is a direct splanchnic vasoconstrictor (a 250 µg bolus followed by constant infusion of 250 µg/hr is as effective as vasopressin). Short acting nitrates (nitroglycerin) via transdermal (10 mg every 12 hours) or sublingual (0. They reduce the peripheral vasospastic effects of vasopressin and lower the portal pressure further via direct vasodilation of portal-systemic collaterals. Complications like aspiration pneumonitis and oesophageal rupture are common, depending on the length of time the balloon is kept inflated. Endoscopic sclerotherapy can be done using sclerosants like sodium morrhuate, absolute alcohol, tetradecyl, ethanolamine oleate, etc. After control of bleeding, sclerotherapy has to be continued for several weeks to months till the varices are fully obliterated. Surgery: creation of a portal systemic shunt to permit decompression of the portal system. Selective shunts decompress only the varices, allowing blood flow to the liver itself. No prophylactic shunt surgery or sclerotherapy should be done on patients with non-bleeding varices. Features of a transudate: ascitic fluid protein < 25 gm/L; serum-ascitic fluid albumin gradient > 1.
Liver transplantation: it is curative for portal hypertension (though not in the acute setting of a variceal bleed) and should be reserved for patients with advanced liver disease.

Pathogenesis

Ascites occurs because of an imbalance between the formation and resorption of peritoneal fluid. Elevated plasma vasopressin and epinephrine levels, in response to a volume-depleted state, accentuate renal and vascular factors. Portal hypertension is not associated with ascites unless there is concomitant hypoalbuminaemia.

Prognosis

Forty to seventy per cent of those bleeding from varices for the first time die. Organisms: coliforms, streptococci, Campylobacter; usually infection is blood-borne. Cultures are more likely to be positive when 10 ml of ascitic fluid is inoculated into two culture bottles at the bedside. If more than two organisms are identified in culture, secondary bacterial peritonitis due to perforation should be considered.

Chylous Ascites

The fluid is milky, creamy and turbid due to the presence of thoracic or intestinal lymph. Sudan staining of fat globules microscopically and increased triglyceride content (> 1000 mg/dL) on chemical examination clinches the diagnosis. However, a triglyceride concentration of > 200 mg/dL is sufficient for the diagnosis.

Mucinous Ascites

Occurs in pseudomyxoma peritonei or colloid carcinoma of the stomach or colon with peritoneal implants. The conditions contributing to refractory ascites resulting in worsening of the primary liver disease are: a. Dietary sodium restriction and diuretics should be continued to prevent rapid reaccumulation of ascitic fluid. Albumin infusion is very costly and its replacement after large paracentesis remains controversial. Liver transplantation: the 12-month survival of patients with ascites refractory to medical therapy is only 25%. Plain abdominal X-ray: demonstrates haziness of the abdomen with loss of the psoas shadow.
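The transudate/exudate distinction discussed above can be sketched as a small classifier. The source table is truncated, so the serum-ascites albumin gradient cut-off of 1.1 g/dL used here is an assumption drawn from standard practice, alongside the ascitic protein cut-off of 25 g/L:

```python
# Sketch of the transudate/exudate distinction for ascitic fluid.
# Ascitic fluid protein < 25 g/L and a serum-ascites albumin gradient (SAAG)
# > 1.1 g/dL suggest a transudate (portal hypertension). The 1.1 g/dL figure
# is assumed from standard practice because the source table is cut off.

def classify_ascites(saag_g_dl, ascitic_protein_g_l):
    if saag_g_dl > 1.1 and ascitic_protein_g_l < 25:
        return "transudate"
    if saag_g_dl <= 1.1 and ascitic_protein_g_l >= 25:
        return "exudate"
    return "indeterminate"  # mixed picture: interpret clinically

print(classify_ascites(1.5, 18))  # transudate (e.g. cirrhotic ascites)
print(classify_ascites(0.8, 35))  # exudate (e.g. peritoneal malignancy, TB)
```

This is an illustration of the cut-offs only; real classification also weighs cell counts, culture and the clinical context.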
These advances not only directly benefit human health, but reduce the indirect cost of disease on livelihoods, and bolster food security (and thus health) by reducing the burden of disease in livestock. Some of the most significant infectious diseases driving global mortality have been eradicated or substantially reduced as part of changing land-use patterns in developing countries. Efforts to reduce disease risk through land conversion have had substantial impacts on global disease burden, as in the reduction or eradication of malaria in many temperate zones via the in-filling of lakes and wetlands or via severe alterations like dredging or the construction of "mosquito ditches" (Hambright & Zohary, 1998; Rozsa, 1995; Willott, 2004). Such a transition may be associated with increases in quality of life for those groups of people who manage to successfully benefit from the transition (possibly via better access to a market economy or increased production of certain goods). However, other groups that do not manage to benefit from the ecological transition may find themselves worse off than before when the safety net provided by natural ecosystems is degraded. But in the developing world, underlying disparities due to poverty and social inequality complicate disease control, and often produce idiosyncratic interactions with land-use changes and environmental degradation (see Figure 5.
Diseases that are comparatively treatable or eradicated in developed countries can be particularly unmanageable in degraded ecosystems, especially where humans live in close proximity to waterways, forests, or other landscape features that increase pathogen exposure from vectors or reservoirs. Furthermore, land degradation often drives short-term declines in health by disturbing the environment and releasing pathogens, in the process of advancing infrastructure that benefits human health in the long term through economic development, food security, and greater mobility and access to healthcare. In this way, the relationship between human health and the environment can have complicated trade-offs at different scales, including through the immediate relationship of any given human with their surroundings, and in the broader feedback between environmental quality and the development and maintenance of technology, infrastructure, and other anthropogenic assets. One of the most difficult elements of land degradation impacts on human health is the role biodiversity loss plays in disease emergence, a process that, by definition, includes both entirely new pathogens and those with sudden increases in prevalence. The emergence of infectious diseases is an ecological process as well as a social one; the majority of emerging pathogens (roughly 75%) are zoonotic (originate in animals, termed reservoirs) and of those, the majority originate in wildlife (Jones et al. While many pathogens are transmitted to humans by insect vectors like mosquitoes, others are spread from wildlife reservoirs into humans through a process called spillover, which can occur directly, or indirectly propagated by livestock or domesticated animals (Johnson et al. 
Because of the diverse strategies that emerging pathogens can exploit, patterns of land use, agriculture, biodiversity, human-wildlife contact and human health infrastructure can interact to produce complex and often unpredictable disease dynamics (Wilcox & Colwell, 2005). On a global scale, the rate of emergence and re-emergence of infectious diseases has accelerated substantially since the industrial revolution, and continues to do so (Cohen, 2000), most likely as a consequence of global changes in climate and land use (Figure 5. Biodiversity in undisturbed ecosystems may dilute the prevalence of disease in ecosystems in ways that ultimately benefit humans; and consequently, declines in biodiversity may increase the frequency of outbreaks in wildlife (termed epizootics) that seed human outbreaks (epidemics). Higher biodiversity ecosystems can also have a greater overall richness of new pathogens that can eventually enter human populations. Biodiversity loss may therefore decrease the total richness of pathogens that humans encounter. Below, we explore that interplay more deeply for three main case studies: (i) vector-borne diseases; (ii) rare episodic spillover zoonoses that originate in wildlife; and (iii) pathogens that reach human populations via livestock or agriculturally-related impacts. We further describe the relationship between land degradation and non-infectious diseases, in particular, noting that land degradation almost universally reduces water quality and exacerbates human exposure to pollutants, toxins, and pathogens. We conclude with an assessment of the potential impacts of land degradation and biodiversity loss on two key indirect components of clinical outcomes: the discovery of new pharmaceuticals in nature, and the role mental health plays in overall human health outcomes.
Mosquito-borne diseases are particularly challenging in this regard, as development projects can increase human exposure to natural mosquito habitat (especially at the times of day Anopheles mosquitoes are most active) and produce more suitable habitat like forest edges and associated microclimates (de Castro et al.). Conversion of forests into agricultural or mining land especially facilitates accumulation of standing water that exacerbates Anopheles- and Aedes-borne diseases (Patz et al.). Land-use changes associated with that conversion, like road building, are strongly linked in South America to workers presenting with "frontier malaria" and leishmaniasis, and in Africa to trypanosomiasis (Myers & Patz, 2009; Patz et al.). However, the effects of deforestation on malaria especially are regionally variable (and likely better understood than for any other vector-borne disease), and highly dependent on local vector ecology; for example, it is likely that malaria is more strongly associated with deforestation in Africa and South America than in Asia, due to the greater richness of Anopheles species in southeast Asia, only some of which are ecologically specialized in such a way that they benefit from deforestation (Myers et al.). Deforestation is not the only land-use change with substantial, direct links to vector-borne disease. Development projects like dam building and irrigation, which produce substantial gross benefits through employment, and energy and food security, usually produce hydrological impacts that consistently exacerbate local risk for several pathogens, especially malaria, schistosomiasis (vectored by snails), and onchocerciasis (vectored by black flies) (Morse, 2001; Patz et al.). In cases like these, the downstream benefits of these projects often reach different populations than the local communities that face near-immediate increases in overall health burdens.
Further development of rural land into urban or peri-urban environments may decrease direct human contact with nature and can increase access to medical care for environmentally-mediated diseases for some people; but pre-existing health disparities, such as poor diet or limited access to healthcare, can severely exacerbate morbidity and mortality from urban outbreaks (Redman & Jones, 2005). Urbanization, however, also increases the risk of other vector-borne pathogens like dengue fever where water collects and Aedes mosquitoes thrive (Gubler, 2011). Other diseases like plague or leptospirosis, which utilize rats as amplification hosts, can pose a severe risk in urban settings (Costa et al.). Some evidence has suggested that cattle ownership can act as a sort of passive prophylaxis that decreases the burden of diseases like malaria, but case studies suggest that this phenomenon is inconsistent (Tirados et al.). Consequently, agricultural conversion may offer only a limited buffer for human health. For some pathogens, such as Japanese encephalitis or Rift Valley fever, close human proximity to livestock populations likely increases outbreaks (Jones et al.). The relationship between biodiversity and zooprophylaxis is poorly understood, but current theory indicates that land degradation-driven loss of biodiversity could substantially increase disease prevalence in wildlife and humans. This is termed the "biodiversity dilution effect", in which greater species richness of (usually mammal or bird) host communities corresponds to lower disease risk. Dilution effects have been suggested as a potential factor in outbreaks of a number of different pathogens, including hantavirus, Lyme disease, West Nile virus and possibly Chagas disease.
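The dilution logic described above can be illustrated with a deliberately minimal sketch. This is not a model drawn from the epidemiological literature: the function name, the parameter values, and the assumption that each additional host species is less competent than the last and "wastes" a fixed fraction of transmission events are all purely illustrative.

```python
def community_r0(n_species, base_r0=2.5, dilution=0.15):
    """Toy dilution-effect calculation (illustrative only).

    Assumes a single highly competent reservoir host sets the baseline
    transmission potential (base_r0), and that each additional, less
    competent host species absorbs a fixed fraction of transmission
    events (dilution). Neither parameter is empirical.
    """
    if n_species < 1:
        raise ValueError("need at least one host species")
    return base_r0 * (1 - dilution) ** (n_species - 1)

# Under these assumptions, richer host communities push the
# community-level transmission potential toward (and eventually
# below) 1, the threshold under which outbreaks fade out.
for n in (1, 4, 8, 12):
    print(n, round(community_r0(n), 2))
```

The point of the sketch is only the qualitative shape: transmission potential declines monotonically with richness under these assumptions, which is exactly the community-structure condition (additional hosts less competent than common ones) that the controversy discussed below turns on.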
Whereas zooprophylaxis has little relationship to diversity, the dilution effect is conditional on high species richness and on a community structure in which additional hosts are less competent than common ones. The most competent hosts are often assumed to be the most generalist and resilient to ecological change (such as rodent pest species), and thus most resistant to biodiversity loss (and land degradation), potentially linking loss of wildlife species to increasing disease transmission risk. In the absence of data about the drivers of specific outbreaks, current scientific paradigms often recommend the maintenance of biodiversity as a buffer against disease (Civitello et al.). However, the biodiversity dilution effect is still a topic of significant controversy and has not been observed in some cases (Salkeld et al.). Consequently, some studies have concluded that arguments based on other benefits of land restoration and ecosystem health are more convincing than disease dilution, as dilution may depend more on the species in a community than on total richness (Randolph & Dobson, 2012). For example, studies have found that deforestation and land degradation favour rodent reservoirs of zoonoses in Southeast Asia (Morand et al.). Similarly, evidence indicates that the sudden increase in emerging infectious disease spillover events originating in bats in West Africa and Southeast Asia is likely a product of deforestation, especially for agricultural purposes, that pushes bats into human-occupied landscapes (Jones et al.). Land degradation also effects social changes that can alter patterns of human-wildlife contact and thereby indirectly change patterns of zoonotic spillover. Land-use transitions have distanced many populations from sources of infectious disease, such as bushmeat (a common reservoir for viral spillover).
Increasing the distance, in particular between human dwellings and livestock or wildlife, substantially reduces the direct risk of zoonotic spillover. However, land-use change can also increase the force of infection of some spillover diseases. For example, deforestation is believed to be linked to an increase in bushmeat consumption in many regions, which provides a pathway for spillover of viruses like Ebola and Marburg (Foley et al.). While the negative impacts of land degradation on biodiversity are widely undesirable, biodiversity loss may not always be a driver of zoonotic emergence. Compared to vector-borne diseases, directly transmitted zoonoses lack a theoretically-established mechanism for a biodiversity dilution effect that would be weakened by land degradation. Moreover, strong evidence suggests that higher-biodiversity ecosystems have a higher overall diversity of pathogens in their zoonotic pool (Han et al.). For instance, maintaining diverse ecosystems on shared grazing lands, if responsible for increasing wildlife-livestock contact, could increase spillover and spillback of diseases like anthrax, brucellosis and bovine tuberculosis (Kruse & Handeland, 2004). This could ultimately increase spillover of human disease through livestock, as with Nipah virus, which spread from bats, via pigs, to humans (Pulliam et al.). However, biodiversity may sometimes act as a buffer against the invasion of introduced species, and therefore against the pathogens they vector or carry; biological invasions and introductions have facilitated the majority of emergence events in some classes of disease (Anderson et al.). Ebola still represents an unpredictable and enigmatic problem for local public health institutions. Even though the reservoir of Ebola (and many other elements of its basic biology) remains controversial, recent work shows that Ebola spillover events are highly associated with hotspots of forest fragmentation due to deforestation (Rulli et al.).
In response to the 2014 outbreak of Ebola, some have called for an end to the bushmeat trade in West Africa as a net benefit to both human health and primate conservation, and as the simplest solution to the continued threat of disease spillover in the region. Others have criticized that approach by conservation groups as potentially "tone-deaf" (Pooley et al.). Land conversion can open up new pathways for more sustainable meat production, potentially lessening financial disparities and decreasing food insecurity for poor local populations (and cutting the Gordian knot of bushmeat and Ebola). However, while deforestation and land conversion may yield a short-term benefit to bushmeat availability (at a cost to long-term availability as wildlife populations decline), that benefit has been shown in Congo to accrue primarily to non-local, non-indigenous populations (Poulsen et al.). The spillover of pathogens from zoonotic reservoirs into human populations via bushmeat hunting and trade is one of the most complex avenues of infectious disease emergence, and highlights the challenging interplay of land use and development with patterns of emerging disease. Deforestation and associated development practices, such as road building and increased forest-edge settlement, have the potential to significantly increase human-wildlife contact. The degree of extraction, however, also sets ecological changes in motion that in some contexts can amplify or reduce disease prevalence in reservoirs and vectors (Wolfe et al.). Regional variability in demand for bushmeat, and differences in food preparation practices, further contribute to exposure levels. The restoration of degraded ecosystems and maintenance of biodiversity hotspots is likely to slow the spread of invasive species that facilitate disease, a special case of the more general idea that ecosystem diversity contributes greatly to the maintenance of natural enemies in cultured systems (Landis et al.).
Consequently, from the perspective of biodiversity-disease relationships, a strong case exists for the restoration of degraded ecosystems. The latter two pathways are often correlated, and outbreaks of livestock disease can have particularly negative human health impacts on local communities by depriving them of nutrition simultaneously with disease outbreaks. Because agricultural intensification is usually related to land conversion (often deforestation), increased contact between agriculture and disturbed land often introduces zoonoses and other pathogens into human populations. Deforestation is especially common as a driver of agriculturally-linked outbreaks. Encroachment on forests alone is a particularly common disease driver for pathogens like leishmaniasis, malaria, and others; for example, farmers in deforested areas were the first to present with the rare Kyasanur forest disease (a tick-borne viral disease) in India (Jones et al.). More directly, livestock can act as an intermediate host through which viruses enter human populations, such as in the transmission of Nipah virus from bats to humans via pigs. Agricultural intensification, especially at fragmented ecosystem edges, can especially amplify this process. In some cases, overcrowded livestock populations offer an environment for pathogen evolution that allows otherwise-impossible spillover events, as in the possible spread of new strains of highly pathogenic avian influenza via poultry into humans. Especially in cases related to deforestation, agricultural intensification is liable to come at a cost to water quality, providing another entry point for pathogens into human populations. Climate-driven land changes are likely to alter the disease dynamics of the human-livestock interface in complex ways. For example, aridification is likely to increase the burden of currently neglected diseases like anthrax that are tightly associated with desert environments.
The relationship between anthrax, a soil-transmitted bacterium, and different types of soil degradation is poorly understood, and livestock outbreaks with human impacts could become more common over time (though little data has been collected). However, for other classes of pathogen, especially vector-borne diseases, evidence suggests the net impact of climate change may be comparatively smaller than the impact of land-use change. For example, land conversion is predicted to make a far more substantial impact on the overall burden of African trypanosomiasis (a disease of both cattle and humans) than climate change (Thornton et al.). Agriculture-related irrigation amplifies several classes of pathogens, especially vector-borne diseases. For example, outbreaks of Japanese encephalitis virus (a mosquito-borne illness) are driven by the interaction of irrigation and pig farming, as pigs are an amplification host that intensifies human outbreaks; and irrigation has similarly been linked to outbreaks of Rift Valley fever and human fascioliasis (Jones et al.). Protozoan diseases, especially cryptosporidiosis, are spread from livestock to humans when contaminated runoff enters waterways, sometimes producing outbreaks of hundreds of thousands of cases from a single storm event (Myers et al.). Water contamination also poses a significant problem for the spread of drug-resistant pathogen strains at the wildlife-human interface. Macroparasitic diseases, like those caused by parasitic worms, may be particularly favoured by "environmental nutrient enrichment" from agricultural runoff (Jones et al.). Overuse of antibiotics in agriculture has produced one of the most significant modern crises in public health, driving the emergence of antibiotic-resistant bacteria in livestock that ultimately spill over into human populations (Witte, 1998), and a similar problem exists for the use of antibiotics in fisheries (Cabello, 2006).
Drug-resistant strains often originate in sewage, as pharmaceutical compounds and their derivatives enter waterways through pollution and runoff, circulate in degraded ecosystems, and re-enter human populations via livestock. The focus of this section is primarily on water contamination and the loss of regulatory bioremediation services. Nevertheless, soil pollution from industrial processes, while long known, is now being recognized as a key area where land degradation is impacting human health (Brevik & Sauer, 2015). For example, in Europe it is estimated that 250,000 sites, out of a total of 3 million, are in urgent need of remediation for heavy metal or oil pollution. The negative health impacts of these sites include increased risk of cancer, kidney and bone diseases, as well as neurological damage (Science Communication Unit, 2013). There is a growing body of literature exploring the impacts and the options for restoration (Brevik & Sauer, 2015; Su et al.). Turning to water contamination: the destruction of wetlands and other ecosystems that transform and accumulate nutrients (especially nitrogen and phosphorus) and toxins often releases those substances directly back into waterways, to the detriment of human health. More intense land degradation activities like open-pit mining produce toxic runoff. Polluted soils also significantly decrease agricultural yields, and in downstream impacts, heavy metal toxicity in humans (especially from arsenic, lead, cadmium, or mercury) can lead to both acute illness and long-term neurological damage. Urbanization consistently increases pollutant load, both water- and airborne, while decreasing or eliminating the natural ecosystems that filter those toxins, leading to human health threats like atmospheric brown clouds (Myers et al.). Similarly, urban and peri-urban slums with poor sanitation face a particularly severe risk from cholera outbreaks and other diarrheal diseases.
In 1841, Bird wrote a report on the value of electricity in the treatment of disease. Many of the women at the clinic were treated for lack of menstruation or irregular menstruation. He found that electricity was useful in treating facial paralysis and paralysis of the limbs. Many of the patients suffering from the pain of trench foot had immediate relief from diathermy and electricity. Faradism was usually produced with low-voltage and low-frequency alternating current. The alternating current was surged with gradually increasing strength until definite contractions of the abdominal wall were evident. The contractions were synchronized with the speed of breathing, about 20 breaths per minute. Faradism was most useful when patients had obscure pains in the back, buttocks, or thighs. Then he used the movable electrode to probe the sensitive area and increased the current until the patient could feel it. A woman was in a car accident that resulted in pain in her back, which kept her awake all night. A dispersive electrode was placed under the arm, and the movable electrode was adjusted so it could just barely be felt. The entire area of the back was tested, and the painful area was located over the first lumbar vertebra. When the sensitive spot was found, the electrode was kept over the area, and a strong current was briefly applied. A young man had paralysis and spastic contractions in the upper left shoulder and lower left leg after an auto accident. Electrical stimulation and submerging the hand and wrist in hot water for 15 minutes resulted in slow improvement. A 46-year-old man with multiple sclerosis developed progressive marked spasticity of the upper and lower body.
Electrical stimulation allowed him to stand on crutches and walk with a little help. He would treat people with tabes and severe ataxic phenomena who could scarcely walk. The muscle has a single strong contraction instead of the tetanus produced by higher-frequency current. In treating incontinence of urine, the indifferent electrode is placed on a pad on the lower back. The active electrode is placed on the perineum, the triangular area between the anus and the sexual organs. After three applications, he showed marked improvement; he was cured after eight treatments. He was treated by infrequently interrupted faradic currents for 20 minutes every other day for six weeks. There was improvement during the day after ten treatments, and he became normal after six weeks of treatment. A nervous eight-year-old girl passed urine involuntarily with the slightest excitement. The treatment was discontinued for a month, and then was resumed for another six weeks, when she was determined to be cured. The treatments were also used for anal incontinence, using two electrodes placed together on the high side of the anus. The pulses were increased until there were marked contractions of the sphincter muscles; the stimulation enabled function to become normal. The stimulation produced by faradic currents often cured or helped kidney disease. After 38 treatments with faradic currents and vibration, she had completely recovered. A faradic current was generated at 100 hertz from an induction coil to give the maximum motor response. The dispersive electrode was placed on the neck and the active electrode was moved slowly down the arm. She was able to resume work and continued with one treatment a week for the next six months. A woman had some infected teeth removed, but intense pain developed on the sides of her left index finger. Faradic currents were the result of mechanical interruption of a battery current, which generated an induced current.
The interrupter was a vibrating spring that was attracted to an electromagnet and then bounced away. In 1842, Professor Joseph Henry called attention to the phenomenon accompanying the discharge of a Leyden jar. This was confirmed by Lord Kelvin in 1855, and also by Hermann von Helmholtz. A slowly interrupted current produces a sudden muscular contraction followed by relaxation. As the frequency is increased there is first tetanism of the muscles, then little feeling. This was a sine wave generator that produced a current of 21,000 alternations per minute. It could produce wave variations; the surged current was used to treat bronchitis. The new forms of alternating current were widely used in medicine for many conditions. A worker suffered a spinal concussion in a railroad collision; his left side was partially paralyzed. Morse wave current was applied twice to the cervical and mid-dorsal regions of his spine. As soon as the current was introduced, the patient reacted with a jolt, and his body muscles stiffened; then he fell back on the bed without loss of consciousness. Naturally we who were conducting the experiment were under great emotional strain, and felt we had already taken quite a risk.
We observed the same instantaneous, brief, generalized spasm, and soon after, the onset of the classic epileptic convulsion (Ugo Cerletti, 1936). Ten years after the Leyden jar was invented in 1745, Richard Lovett claimed to have successfully treated mental illness by electric sparks and static current. The next day, Aldini applied shocks through his ear, and the cure progressed rapidly. He began with a Budapest laborer who lay rigidly in bed staring into the distance. This produced convulsions, and a series of additional camphor treatments were given. After the fifth set of convulsions, the patient asked doctors where he was and if he could have breakfast. The injections were painful, and a long delay occurred before the seizures started. There were bad side effects, so he tried to think of other ways to give mental patients epileptic convulsions. The researchers tried treating schizophrenics by putting them into an insulin coma. The blood sugar level fell so low after an injection of insulin that it produced a type of shock. He found that the chest current was the most dangerous and the head current was the least dangerous. The pigs normally regained consciousness in about five minutes and got on their feet. Cerletti found that passing current across the body from hand to hand was more dangerous because it crossed the heart. The butchers slashed the pigs' necks so they bled to death while they were in shock. Using the knowledge obtained by working on pigs, Cerletti and Bini decided to try this on a human patient. He noted that the tiresome whistling in his ears that troubled him for years disappeared. There was a great deal of concern that shock treatments might have some hidden danger. The "annihilation" method results in severe amnesia reactions that have a good influence in obsessive states, psychogenic depression, and in some paranoid cases. Several treatments daily were given for three to four days followed by a three-day rest.
After the first shock treatment, he hallucinated and felt that people were talking about him. A 20-year-old woman had severe tics involving her head, face, arms, hands, and legs. After shock treatment, the blood plasma of shocked patients was injected into mental patients. Some of the first treatments were nothing more than a normal 110-volt power line connected to a hand switch. About half the schizophrenics subjected to the treatment showed a major improvement. He used this to give ten shocks with a one-second interval between each shock per treatment. A 34-year-old woman developed auditory hallucinations, and the voices told her to end her life. She was taken to the Psychiatric Ward of the Minneapolis General Hospital in 1939. L. Ron Hubbard unleashed a national attack on psychiatry and electroshock with the Church of Scientology. There were considerable doubts about the treatment and the wisdom of shocking the brain. It has been called sledgehammer psychiatry, which often produces prolonged memory loss in patients.
This action is not temporary; it continues even after the cessation of the treatment. He was greatly interested in electric treatment and studied the French electric therapies. He devised a special circuit with the static generator connected to two Leyden jars. He called this the static induced current, but it became known as the Morton wave current. The negative terminal was grounded, and a wet sponge or a metal disk electrode was connected to the positive terminal.