937 results for Last two millennia
Abstract:
Prospective cohort studies have provided evidence on the longer-term mortality risks of fine particulate matter (PM2.5), but because of their complexity and cost, only a few have been conducted. By linking monitoring data to the U.S. Medicare system by county of residence, we developed a retrospective cohort study, the Medicare Air Pollution Cohort Study (MCAPS), comprising over 20 million enrollees in the 250 largest counties during 2000-2002. We estimated log-linear regression models with the age-specific mortality rate for each county as the outcome and the county's average PM2.5 level for 2000 as the main predictor. Area-level covariates were used to adjust for socio-economic status and smoking. We reported results under several degrees of adjustment for spatial confounding and with stratification into eastern, central, and western counties. We estimated that a 10 µg/m3 increase in PM2.5 is associated with a 7.6% increase in mortality (95% CI: 4.4 to 10.8%). We found a stronger association in the eastern counties than nationally, with no evidence of an association in western counties. When adjusted for spatial confounding, the estimated log-relative risks dropped by 50%. We demonstrated the feasibility of using Medicare data to establish cohorts for follow-up of the effects of air pollution.

Particulate matter (PM) air pollution is a global public health problem (1). In developing countries, levels of airborne particles still reach concentrations at which serious health consequences are well documented; in developed countries, recent epidemiologic evidence shows continued adverse effects, even though particle levels have declined in the last two decades (2-6). Increased mortality associated with higher levels of PM air pollution has been of particular concern, providing an imperative for stronger protective regulations (7). Evidence on PM and health comes from studies of acute and chronic adverse effects (6). The London Fog of 1952 provides dramatic evidence of the unacceptable short-term risk of extremely high levels of PM air pollution (8-10); multi-site time-series studies of daily mortality show that far lower levels of particles are still associated with short-term risk (5,11-13). Cohort studies provide complementary evidence on the longer-term risks of PM air pollution, indicating the extent to which exposure reduces life expectancy. The design of these studies involves follow-up of cohorts for mortality over periods of years to decades and an assessment of mortality risk in association with estimated long-term exposure to air pollution (2-4,14-17). Because of the complexity and costs of such studies, only a small number have been conducted. The most rigorously executed, including the Harvard Six Cities Study and the American Cancer Society's (ACS) Cancer Prevention Study II, have provided generally consistent evidence for an association of long-term exposure to particulate matter air pollution with increased all-cause and cardio-respiratory mortality (2,4,14,15). Results from these studies have been used in risk assessments conducted for setting the U.S. National Ambient Air Quality Standard (NAAQS) for PM and for estimating the global burden of disease attributable to air pollution (18,19). Additional prospective cohort studies are necessary, however, to confirm associations between long-term exposure to PM and mortality, to broaden the populations studied, and to refine estimates by regions across which particle composition varies. Toward this end, we have used data from the U.S.
Medicare system, which covers nearly all persons 65 years of age and older in the United States. We linked Medicare mortality data to PM2.5 (particulate matter less than 2.5 µm in aerodynamic diameter) air pollution monitoring data to create a new retrospective cohort study, the Medicare Air Pollution Cohort Study (MCAPS), consisting of 20 million persons from 250 counties and representing about 50% of the U.S. elderly population living in urban settings. In this paper, we report on the relationship between longer-term exposure to PM2.5 and mortality risk over the period 2000 to 2002 in the MCAPS.
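As an illustration of the modelling approach described above (not the authors' actual code), the following minimal Python sketch fits a county-level log-linear (Poisson) regression of age-specific death counts on average PM2.5, with person-years as the exposure and area-level covariates for adjustment; the file and column names (mcaps_counties.csv, deaths, person_years, pm25, age_group, median_income, smoking_rate) are hypothetical stand-ins.

import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

# Hypothetical county-level table: one row per county and age stratum, with
# death counts, person-years at risk, the average PM2.5 level, and area-level
# covariates standing in for socio-economic status and smoking.
df = pd.read_csv("mcaps_counties.csv")

model = smf.glm(
    "deaths ~ pm25 + C(age_group) + median_income + smoking_rate",
    data=df,
    family=sm.families.Poisson(),
    exposure=df["person_years"],  # person-years exposure, so the model is on rates
).fit()

# Relative risk for a 10 ug/m3 increase in PM2.5
rr_10 = np.exp(10 * model.params["pm25"])
print(f"Estimated mortality increase per 10 ug/m3 of PM2.5: {100 * (rr_10 - 1):.1f}%")

The regional stratification reported in the abstract would simply repeat the same fit on the eastern, central, and western subsets of counties.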
Abstract:
A variety of modified nucleosides have been developed over the last two decades to improve antisense oligodeoxynucleotide properties such as target affinity, nuclease resistance, and pharmacokinetics. In the context of conformational restriction, we present here the synthesis of the [4.3.0]-bicyclo-DNA thymine monomer via Pd(II)-mediated ring expansion of an intermediate of the tricyclo-DNA synthesis.
Abstract:
OBJECTIVE: The mortality rate in paediatric intensive care units (PICU) has fallen over the last two decades. More advanced treatment is offered to children with life-threatening disease, and there is substantial interest in knowing whether long-term outcome and quality of life after intensive care are acceptable. SETTING: 12-bed paediatric and neonatal intensive care unit. INTERVENTION: Prospective follow-up study with telephone interviews 1 and 2 years after discharge. METHODS: Four domains of quality of life (physical function, role function, social-emotional function and health problem) were recorded by calculating the health state classification (HSC) index. Outcome was classified as good (HSC 1.0-0.7), moderate (HSC 0.69-0.3), poor (HSC 0.29-0) or very poor (HSC <0). RESULTS: 661 patients were admitted to the PICU in the year 2001, with a mortality within the unit of 3.9%. Over the 2-year follow-up there were 21 additional deaths (3.2%). 574 patients could be followed up after 1 year and 464 patients after 2 years. After two years the outcome was good in 77%, moderate in 15% and poor in 8%. Patients with respiratory disease had the best outcome, similar to those admitted for neurological and medical reasons. Patients admitted for postoperative care and for cardiovascular disease had a poorer quality of life. 31% of the children had pre-existing health care problems, and 21% of all patients had a new chronic disease after intensive care. CONCLUSION: The majority of survivors admitted to the PICU have a good outcome. The overall mortality rate doubled when assessed two years after discharge.
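The outcome bands quoted above map onto a simple threshold rule; the short Python sketch below (with an illustrative function name, not code from the study) encodes those cut-offs for an already-computed HSC index.

def classify_hsc(hsc):
    """Map an HSC index to the outcome bands used in the abstract:
    good (1.0-0.7), moderate (0.69-0.3), poor (0.29-0), very poor (< 0)."""
    if hsc >= 0.7:
        return "good"
    if hsc >= 0.3:
        return "moderate"
    if hsc >= 0.0:
        return "poor"
    return "very poor"

# Example: a survivor with an HSC index of 0.65 falls in the moderate band.
print(classify_hsc(0.65))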
Abstract:
Following the last two years' workshops on dynamic languages at the ECOOP conference, the Dyla 2007 workshop was a successful and popular event. As its name implies, the workshop's focus was on dynamic languages and their applications. Topics and discussions at the workshop included macro expansion mechanisms, extension of the method lookup algorithm, language interpretation, reflexivity, and languages for mobile ad hoc networks. The main goal of this workshop was to bring together different dynamic language communities and to foster cross-community interaction. Dyla 2007 was organised as a full-day meeting, partly devoted to presentations of submitted position papers and partly devoted to tool demonstrations. All accepted papers can be downloaded from the workshop's web site. In this report, we provide an overview of the presentations and a summary of the discussions.
Abstract:
Peripheral arterial occlusive disease (PAOD) is a manifestation of systemic atherosclerosis strongly associated with a high risk of cardiovascular morbidity and mortality. In a considerable proportion of patients with PAOD, revascularization, either by endovascular means or by open surgery, combined with the best possible risk factor modification does not achieve limb salvage or relief of ischaemic rest pain. As a consequence, novel therapeutic strategies have been developed over the last two decades aiming to promote neovascularization and remodelling of collaterals. Gene and stem cell therapy are the main directions of clinical investigation. For both, preclinical studies have shown promising results using a wide variety of genes encoding growth factors and populations of adult stem cells, respectively. As a consequence, clinical trials have been performed applying gene- and stem cell-based concepts. However, it has become apparent that a straightforward translation into humans is not possible. While several trials reported relief of symptoms and functional improvement, other trials did not confirm this early promise of efficacy. Ongoing clinical trials with an improved study design are needed to confirm the potential of gene and cell therapy and to close the gaps in our scientific knowledge that would otherwise jeopardize the establishment of angiogenic therapy as an additional medical treatment for PAOD. This review summarizes the experimental background and presents the current status of clinical applications and future perspectives of the therapeutic use of gene and cell therapy strategies for PAOD.
Abstract:
Hyperglycosylated human chorionic gonadotropin (H-hCG) is secreted by the placenta in early pregnancy. Decreased H-hCG levels have been associated with abortion in spontaneous pregnancy. We retrospectively measured H-hCG and dimeric hCG in the sera of 87 in vitro fertilization patients obtained in the 3 weeks following embryo transfer and related the results to pregnancy outcome. H-hCG and dimeric hCG were correlated (r(2) = 0.89) and were significantly decreased in biochemical pregnancy (2 microg/l and 18 IU/l, respectively) compared with early pregnancy loss (22 microg/l and 331 IU/l) and ongoing pregnancy (32 microg/l and 353 IU/l). Only H-hCG tended to discriminate between these last two groups.
Abstract:
The remarkable advances in nanoscience and nanotechnology over the last two decades allow one to manipulate individual atoms, molecules, and nanostructures, make it possible to build devices only a few nanometers in size, and enhance nano-bio fusion in tackling biological and medical problems. This complies with the ever-increasing need for device miniaturization, from magnetic storage devices and electronic building blocks for computers to chemical and biological sensors. Despite continuing efforts based on conventional methods, these are likely to reach the fundamental limit of miniaturization in the next decade, when feature lengths shrink below 100 nm. On the one hand, quantum mechanical effects of the underlying material structure dominate device characteristics. On the other hand, one faces the technical difficulty of fabricating uniform devices. This has posed a great challenge for both the scientific and the technical communities. The proposal of using a single or a few organic molecules in electronic devices has not only opened an alternative way of miniaturization in electronics, but also brought up brand-new concepts and physical working mechanisms in electronic devices. This thesis work stands as one of the efforts in understanding and building electronic functional units at the molecular and atomic levels. We have explored the possibility of having molecules work in a wide spectrum of electronic devices, including molecular wires, spin valves/switches, diodes, transistors, and sensors. More specifically, we have observed a significant magnetoresistive effect in a spin-valve structure in which the non-magnetic spacer sandwiched between two magnetic conducting materials is replaced by a self-assembled monolayer of organic molecules or a single molecule (such as a carbon fullerene). The diode behavior of donor(D)-bridge(B)-acceptor(A) type single molecules is then discussed, and a unimolecular transistor is designed. Lastly, we have proposed and preliminarily tested the idea of using functionalized electrodes for rapid nanopore DNA sequencing. In these studies, the fundamental roles of molecules and molecule-electrode interfaces in quantum electron transport have been investigated based on first-principles calculations of the electronic structure. Both the intrinsic properties of the molecules themselves and the detailed interfacial features are found to play critical roles in electron transport at the molecular scale. The flexibility and tailorability of the properties of molecules have opened great opportunities for purpose-driven design of electronic devices from the bottom up. The results gained from this work have helped in understanding the underlying physics, developing the fundamental mechanisms, and providing guidance for future experimental efforts.
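The molecular-scale transport picture invoked above is conventionally summarized by the Landauer formula, I = (2e/h) ∫ T(E) [f_L(E) - f_R(E)] dE. The Python sketch below evaluates it for an assumed single-level (Breit-Wigner) transmission function; it illustrates the formalism only and does not reproduce the first-principles calculations performed in the thesis.

import numpy as np

E_CHARGE = 1.602176634e-19   # elementary charge, C
H_PLANCK = 6.62607015e-34    # Planck constant, J s
KB_T = 0.025                 # thermal energy at room temperature, eV

def fermi(E, mu):
    # Fermi-Dirac occupation of the electrode with chemical potential mu (eV)
    return 1.0 / (1.0 + np.exp((E - mu) / KB_T))

def transmission(E, eps0=0.5, gamma=0.05):
    # Assumed single molecular level at eps0 (eV), broadened by the electrodes
    return gamma**2 / ((E - eps0) ** 2 + gamma**2)

def current(bias_v, n=4001):
    mu_l, mu_r = bias_v / 2, -bias_v / 2               # symmetric bias window (eV)
    E = np.linspace(-2.0, 2.0, n)                      # energy grid (eV)
    integrand = transmission(E) * (fermi(E, mu_l) - fermi(E, mu_r))
    integral_ev = np.sum(integrand) * (E[1] - E[0])    # integral in eV
    return (2 * E_CHARGE / H_PLANCK) * integral_ev * E_CHARGE  # amperes

print(f"Current at 0.2 V bias: {current(0.2):.2e} A")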
Abstract:
This report is a PhD dissertation proposal to study the in-cylinder temperature and heat flux distributions within a gasoline turbocharged direct injection (GTDI) engine. Recent regulations requiring automotive manufacturers to increase the fuel efficiency of their vehicles have led to great technological achievements in internal combustion engines. These achievements have increased the power density of gasoline engines dramatically in the last two decades. Engine technologies such as variable valve timing (VVT), direct injection (DI), and turbocharging have significantly improved engine power-to-weight and power-to-displacement ratios. A popular trend for increasing vehicle fuel economy in recent years has been to downsize the engine and add VVT, DI, and turbocharging technologies so that a lighter, more efficient engine can replace a larger, heavier one. With the added power density, thermal management of the engine becomes a more important issue. Engine components are being pushed to their temperature limits. Therefore, it has become increasingly important to have a greater understanding of the parameters that affect in-cylinder temperatures and heat transfer. The proposed research will analyze the effects of engine speed, load, relative air-fuel ratio (AFR), and exhaust gas recirculation (EGR) on both in-cylinder and global temperature and heat transfer distributions. Additionally, the effects of knocking combustion and fuel spray impingement will be investigated. The proposed research will be conducted on a 3.5 L six-cylinder GTDI engine. The research engine will be instrumented with a large number of sensors to measure in-cylinder temperatures and pressures, as well as the temperature, pressure, and flow rates of energy streams into and out of the engine. One of the goals of this research is to create a model that will predict the energy distribution to the crankshaft, exhaust, and cooling system based on normalized values for engine speed, load, AFR, and EGR. The results could be used to aid in the engine design phase for turbocharger and cooling system sizing. Additionally, the data collected can be used for validation of engine simulation models, since in-cylinder temperature and heat flux data are not readily available in the literature.
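One way to read the modelling goal stated above is as a regression from normalized operating parameters to measured energy-split fractions; the Python sketch below shows that reading with entirely invented example numbers (the operating points and fractions are illustrative, not data from the proposed engine tests).

import numpy as np
from sklearn.linear_model import LinearRegression

# Inputs: normalized speed, normalized load, relative AFR (lambda), EGR fraction.
X = np.array([
    [0.2, 0.3, 1.00, 0.00],
    [0.3, 0.5, 1.00, 0.05],
    [0.5, 0.5, 1.05, 0.10],
    [0.6, 0.7, 1.10, 0.10],
    [0.8, 0.8, 1.15, 0.15],
    [0.9, 0.9, 1.20, 0.20],
])
# Outputs: fractions of fuel energy going to brake work, exhaust, and coolant
# (the remainder being friction and ambient losses).
Y = np.array([
    [0.27, 0.29, 0.38],
    [0.29, 0.30, 0.36],
    [0.31, 0.32, 0.33],
    [0.32, 0.33, 0.31],
    [0.34, 0.35, 0.28],
    [0.35, 0.36, 0.26],
])

model = LinearRegression().fit(X, Y)
print(model.predict([[0.5, 0.6, 1.05, 0.08]]))  # predicted [brake, exhaust, coolant]

In practice the mapping would be fitted to the instrumented engine data described in the proposal, and a more flexible model form could be substituted for the linear one.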
Abstract:
The push for improved fuel economy and reduced emissions has led to great achievements in engine performance and control. These achievements have increased the efficiency and power density of gasoline engines dramatically in the last two decades. With the added power density, thermal management of the engine has become increasingly important. Therefore, it is critical to have accurate temperature and heat transfer models as well as data to validate them. With the recent adoption of the 2025 Corporate Average Fuel Economy (CAFE) standard, there has been a push to improve the thermal efficiency of internal combustion engines even further. Lean and dilute combustion regimes, along with waste heat recovery systems, are being explored as options for improving efficiency. In order to understand how these technologies will impact engine performance and each other, this research analyzed the engine both from a 1st-law energy balance perspective and from a 2nd-law exergy perspective. This research also provides insights into the effects of various parameters on in-cylinder temperatures and heat transfer, as well as data for validation of other models. It was found that engine load was the dominant factor for the energy distribution, with higher loads resulting in lower coolant heat transfer and higher brake work and exhaust energy. From an exergy perspective, the exhaust system provided the best waste heat recovery potential due to its significantly higher temperatures compared to the cooling circuit. EGR and lean combustion both resulted in lower combustion chamber and exhaust temperatures; however, in most cases the increased flow rates resulted in a net increase in the energy in the exhaust. The exhaust exergy, on the other hand, either increased or decreased depending on the location in the exhaust system and the other operating conditions. The effects of dilution from lean operation and from EGR were compared using a dilution ratio, and the results showed that lean operation resulted in a larger increase in efficiency than the same amount of dilution with EGR. Finally, a method for identifying fuel spray impingement from piston surface temperature measurements was found. Note: The material contained in this section is planned for submission as part of a journal article and/or conference paper in the future.
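To make the first-law/second-law distinction above concrete, the Python sketch below computes both the energy and the (ideal-gas) flow exergy of an exhaust stream relative to ambient; all operating-point numbers are illustrative assumptions, not results from this work.

import numpy as np

CP, R_GAS = 1.10, 0.287      # kJ/(kg K), approximate values for hot exhaust gas
T0, P0 = 298.0, 101.325      # ambient reference temperature (K) and pressure (kPa)

def flow_exergy(T, p):
    # Specific flow exergy of an ideal-gas stream relative to the dead state:
    # ex = cp*(T - T0) - T0*(cp*ln(T/T0) - R*ln(p/P0))   [kJ/kg]
    return CP * (T - T0) - T0 * (CP * np.log(T / T0) - R_GAS * np.log(p / P0))

# Assumed operating point
m_dot_exh = 0.027            # exhaust mass flow, kg/s
T_exh, p_exh = 900.0, 105.0  # exhaust temperature (K) and pressure (kPa)
q_fuel, w_brake, q_cool = 60.0, 20.0, 18.0  # fuel energy, brake work, coolant heat, kW

q_exh = m_dot_exh * CP * (T_exh - T0)            # first-law exhaust energy rate, kW
ex_exh = m_dot_exh * flow_exergy(T_exh, p_exh)   # second-law exhaust exergy rate, kW
residual = q_fuel - w_brake - q_cool - q_exh     # friction, ambient losses, etc.

print(f"Exhaust energy: {q_exh:.1f} kW, exhaust exergy: {ex_exh:.1f} kW")
print(f"Unaccounted losses in the first-law balance: {residual:.1f} kW")

The gap between the exhaust energy and exergy figures is exactly the point made in the abstract: only part of the exhaust energy is thermodynamically available for waste heat recovery.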
Abstract:
Colloidal Nano-apatite Particles with Active Luminescent and Magnetic Properties for Biotechnology Applications. The synthesis of functional nano-materials is a burgeoning field that has produced remarkable and consistent breakthroughs over the last two decades. Individual particles have become smaller and shown potential for well-defined functionality. However, there are still unresolved problems, a primary one being the loss of functionality and novelty due to uncontrolled aggregation driven by surface energy considerations. As such, the first design criterion for harnessing the true potential of nanoparticles is to prevent unwanted agglomeration by (1) improving and, if possible, (2) controlling aggregation behavior. This requires specific knowledge of the chemistry of the immediate locale of the intended application, especially for biologically relevant applications. The latter criterion is also application driven but should be considered, generally, to diversify the range of functional properties that can be achieved. We now have reason to believe that such a novel system with multifunctional capabilities can be synthesized rather conveniently and could have far-reaching impact in biotechnology and other applications in the near future. We are presently experimenting with the syntheses of spheroidal, metal-doped, colloidal apatite nano-particles (~10 nm) for several potential biomedical applications.
Abstract:
To what extent is “software engineering” really “engineering” as this term is commonly understood? A hallmark of the products of the traditional engineering disciplines is trustworthiness based on dependability. But in his keynote presentation at ICSE 2006, Barry Boehm pointed out that individuals’, systems’, and peoples’ dependency on software is becoming increasingly critical, yet dependability is generally not the top priority for software-intensive system producers. Continuing in an uncharacteristically pessimistic vein, Professor Boehm said that this situation will likely continue until a major software-induced system catastrophe, similar in impact to the 9/11 World Trade Center catastrophe, stimulates action toward establishing accountability for software dependability. He predicts that it is highly likely that such a software-induced catastrophe will occur between now and 2025. It is widely understood that software, i.e., computer programs, is intrinsically different from traditionally engineered products, but in one aspect they are identical: the extent to which the well-being of individuals, organizations, and society in general increasingly depends on software. As wardens of the future, through our mentoring of the next generation of software developers, we believe that it is our responsibility to at least address Professor Boehm’s predicted catastrophe. Traditional engineering has addressed, and continually addresses, its social responsibility through the evolution of the education, practice, and professional certification/licensing of professional engineers. To be included in the fraternity of professional engineers, software engineering must do the same. To get a rough idea of where software engineering currently stands on some of these issues, we conducted two surveys. Our main survey was sent to software engineering academics in the U.S., Canada, and Australia. Among other items, it sought detailed information on their software engineering programs. Our auxiliary survey was sent to U.S. engineering institutions to get some idea of how software engineering programs compare with those in the established engineering disciplines of Civil, Electrical, and Mechanical Engineering. Summaries of our findings can be found in the last two sections of our paper.
Abstract:
Over the last two decades, American police experts have developed new policing philosophies in order to tackle increasing crime problems more successfully. Community Policing tries to improve cooperation between the population and the police and to increase trust in the police. A crucial factor is meaningful cooperation between the police and the citizens. Problem Oriented Policing aims at structural changes in the organisation and the procedures of the police in public. The police have to investigate the hidden problems and conflicts behind an individual offence and to create proactive, long-term concepts for the social area of conflict beyond the specific case. It is doubtful whether these philosophies can be implemented in Germany, since the legality principle prohibits meaningful, trustworthy relationships between citizens and police officers. However, if one examines the results of surveys on citizens' views of and expectations towards the police, one finds that the majority of German citizens favour the postulates of community and problem oriented policing. They expect these measures to improve their life situation in the community and their feelings of safety. If one takes these results seriously, one has to ask whether the legality principle is still appropriate. It seems to hamper new, more promising policing styles which promise to improve citizens' lives and which reflect what citizens want and expect from their police force.
Abstract:
To assess bone mineral density (BMD) in idiopathic calcium nephrolithiasis, dual-energy x-ray absorptiometry was performed at the lumbar spine, upper femur (femoral neck, Ward's triangle, and total area), distal tibial diaphysis, and distal tibial epiphysis in 110 male idiopathic calcium stone formers (ICSF; 49 with and 61 without hypercalciuria on a free-choice diet). Results were compared with those obtained in 234 healthy male controls, using (1) noncorrected BMD, (2) BMD corrected for age, height, and BMI, and (3) a skeletal score based on a tercile distribution of BMD values at the following four sites: lumbar spine, Ward's triangle, tibial diaphysis, and tibial epiphysis. After correction, BMD (and therefore also the skeletal score) tended to be lower in the stone formers than in controls at five of the six measurement sites, that is, lumbar spine, upper femur, Ward's triangle, tibial diaphysis, and tibial epiphysis, with the limit of significance being reached for the last two sites, without difference between hypercalciuric (HCSF) and normocalciuric stone formers (NCSF). Estimated current daily calcium intake was significantly lower in patients (616 +/- 499 mg/24 h, mean +/- SEM) than in controls (773 +/- 532, p = 0.02). Of 17 patients who in the past had received a low-calcium diet for at least 1 year, 10 had a low skeletal score (4-6) whereas only 1 had a high score (10-12; p = 0.037). Of the 12 stone formers in the study with a skeletal score of 4 (i.e., the lowest), 8 had experienced one or more fractures of any kind in the past, versus only 19 of the remaining 77 patients with skeletal scores of 5-12 (p = 0.01). (ABSTRACT TRUNCATED AT 250 WORDS)
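The 4-12 range of the skeletal score is consistent with assigning 1, 2, or 3 points per site according to the control tercile in which a patient's BMD falls and summing over the four sites; the Python sketch below implements that reading as an illustration of a tercile-based score, not the authors' published algorithm, and all numbers in it are invented.

import numpy as np

def skeletal_score(patient_bmd, control_bmd):
    # One point per control tercile (1 = lowest, 3 = highest), summed over sites.
    score = 0
    for site, value in patient_bmd.items():
        t1, t2 = np.percentile(control_bmd[site], [33.3, 66.7])
        score += 1 if value <= t1 else (2 if value <= t2 else 3)
    return score

rng = np.random.default_rng(0)
controls = {site: rng.normal(mean, 0.12, 234) for site, mean in
            [("lumbar_spine", 1.05), ("wards_triangle", 0.80),
             ("tibial_diaphysis", 0.95), ("tibial_epiphysis", 0.90)]}
patient = {"lumbar_spine": 0.92, "wards_triangle": 0.70,
           "tibial_diaphysis": 0.85, "tibial_epiphysis": 0.82}
print(skeletal_score(patient, controls))   # a low score indicates low bone density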
Abstract:
Over the last two years, a discussion on reforming the public sector has emerged. At its very heart we find important concepts like ‘quality reform’, ‘democracy’, and ‘development’. Recently I presented an example of the ‘quality reform’ in SocMag, and this leads me to extend that discussion of central themes of the welfare state and democracy. Much energy is invested in arguing about the management of the public sector: Do we need more competition from private companies? Do we need more control? Are more contracts concerning outcomes needed? Can we be sure of the accountability needed from politicians? How much documentation, effectiveness measurement, bureaucracy, and evidence-based policy and practice are we looking for? A number of interesting questions, but strangely enough we do not discuss the purpose of ‘keeping a welfare state’. What sort of understanding lies behind the welfare state, and what kind of democracy are we drawing upon?
Abstract:
Complementary to automatic extraction processes, Virtual Reality technologies provide an adequate framework for integrating human perception into the exploration of large data sets. In such a multisensory system, thanks to intuitive interactions, a user can take advantage of all of his or her perceptual abilities in the exploration task. In this context, haptic perception, coupled with visual rendering, has been investigated over the last two decades, with significant achievements. In this paper, we present a survey of the exploitation of haptic feedback in the exploration of large data sets. For each haptic technique introduced, we describe its principles and its effectiveness.
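As one simple example of the kind of haptic coupling such systems use (a common gradient-based scheme, not a technique singled out by this survey), the Python sketch below derives a force from the local gradient of a scalar volume so that the user feels steep transitions in the data.

import numpy as np

def haptic_force(volume, probe_idx, stiffness=0.5):
    # Force opposing the local data gradient at the voxel nearest the probe,
    # so iso-surfaces and sharp transitions are felt as resistance.
    gx, gy, gz = np.gradient(volume.astype(float))
    i, j, k = probe_idx
    grad = np.array([gx[i, j, k], gy[i, j, k], gz[i, j, k]])
    return -stiffness * grad

# Toy data set: a spherical blob; near its edge the force pushes the probe outward.
x, y, z = np.mgrid[-16:16, -16:16, -16:16]
volume = np.exp(-(x**2 + y**2 + z**2) / 200.0)
print(haptic_force(volume, (20, 16, 16)))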