989 results for Engineers without Borders challenge


Relevance: 20.00%

Abstract:

A strain of avian influenza A virus was adapted to grow in mouse peritoneal macrophages in vitro. The adapted strain, called M-TUR, induced a marked cytopathic effect in macrophages from susceptible mice. Mice homozygous (A2G) or heterozygous (F1 hybrids between A2G and several susceptible strains) for the gene Mx, shown previously to induce a high level of resistance towards lethal challenge by a number of myxoviruses in vivo, yielded peritoneal macrophages which were not affected by M-TUR. Peritoneal macrophages could thus be classified as resistant or susceptible to M-TUR without sacrificing the cell donor. Backcrosses were arranged between (A2G × A/J)F1 and A/J mice. Sixty-four backcross animals could be tested individually both for resistance of their macrophages in vitro after challenge with M-TUR and for resistance of the whole animal in vivo after challenge with NWS (a neurotropic variant of human influenza A virus). Macrophages from 36 backcross mice were classified as susceptible, and all of these mice died after challenge. Macrophages from 28 mice were classified as resistant, and 26 of these mice survived challenge. We conclude that resistance of macrophages and resistance of the whole animal are two facets of the same phenomenon.
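
The genetic argument in this abstract rests on two counts: the roughly 1:1 segregation expected in a backcross for a single dominant resistance gene, and the near-perfect concordance between the in vitro and in vivo phenotypes. As a hedged illustration (the statistical check below is ours, not the paper's), both can be verified from the reported numbers:

```python
# Sketch: check whether the reported backcross counts (36 susceptible,
# 28 resistant macrophage phenotypes) are consistent with the 1:1
# segregation expected for a single dominant resistance gene (Mx).
# Counts come from the abstract; the choice of test is illustrative.
from scipy.stats import chisquare

observed = [36, 28]                 # susceptible, resistant backcross mice
stat, p = chisquare(observed)       # expected 32:32 under 1:1 segregation
print(f"chi-square = {stat:.2f}, p = {p:.2f}")  # p > 0.05: consistent with 1:1

# Concordance between in vitro and in vivo phenotypes as reported:
# 36/36 susceptible mice died, 26/28 resistant mice survived.
concordance = (36 + 26) / 64
print(f"phenotype concordance = {concordance:.2%}")
```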

Relevance: 20.00%

Abstract:

Summary: Friedreich's ataxia (FRDA), the most common autosomal recessive ataxia, is characterised by progressive ataxia with dysarthria of speech, loss of deep-tendon reflexes, impaired vibratory and proprioceptive sensation and corticospinal weakness with a Babinski's sign. Patients eventually also develop kyphoscoliosis, cardiomyopathy and diabetes mellitus. The disease is a GAA repeat disorder resulting in severely reduced levels of frataxin, with a secondary increased sensitivity to oxidative stress. The anti-oxidative drug idebenone is effective against FRDA-associated cardiomyopathy. We provide detailed clinical, electrophysiological and biochemical data from 20 genetically confirmed FRDA patients and have analysed the relationship between phenotype, genotype and malondialdehyde (MDA), a marker of superoxide formation. We assessed the effects of idebenone biochemically by measuring blood MDA and clinically by serial measurements of the International Cooperative Ataxia Rating Scale (ICARS). The GAA repeat length influenced the age at onset (p < 0.001), the severity of ataxia (p = 0.02), and the presence of cardiomyopathy (p = 0.04) and of low-frequency hearing loss (p = 0.009). Multilinear regression analysis showed (p = 0.006) that ICARS depended on two variables, disease duration (p = 0.01) and size of the GAA expansion (p = 0.02). We found no correlation with bilateral palpebral ptosis, visual impairment, diabetes mellitus or skeletal deformities, all of which appear to be signs of disease progression rather than severity. We discuss more thoroughly two under-recognised clinical findings: palpebral ptosis and GAA length-dependent low-frequency hearing loss. The average ICARS remained unchanged in the 10 patients for whom follow-up on treatment was available (mean 2.9 years), whereas most patients treated with idebenone reported an improvement in dysarthria (63%), hand dexterity (58%) and fatigue (47%) after taking the drug for several weeks or months. Oxidative stress analysis showed an unexpected increase in blood MDA levels in patients on idebenone (p = 0.04); we discuss the putative underlying mechanism for this result, which could explain the unique efficacy of idebenone in treating FRDA-associated cardiomyopathy, as opposed to other anti-oxidative drugs. Indeed, idebenone is not only a powerful stimulator of complexes II and III of the respiratory chain but also an inhibitor of complex I activity, thereby promoting superoxide formation. Our preliminary clinical observations are the first to date supporting an effect of idebenone in delaying neurological worsening. Our MDA results point to the dual effect of idebenone on oxidative stress and to the need for controlled studies to assess its potential toxicity at high doses on the one hand, and to revisit the exact mechanisms underlying the pathophysiology of Friedreich's ataxia on the other, while recent reports suggest a non-oxidative pathophysiology of the disease.
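
The multilinear regression described (ICARS modelled on disease duration and GAA expansion size) can be sketched as follows; the patient values below are invented for illustration, and the model call is only an assumed generic analogue of the analysis reported:

```python
# Illustrative sketch of the multilinear regression described in the abstract:
# ICARS score modelled on disease duration and GAA repeat length.
# The data below are made up for demonstration; they are NOT the study data.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.DataFrame({
    "icars":    [22, 35, 48, 30, 55, 41, 27, 60, 38, 50],
    "duration": [ 4, 10, 15,  8, 20, 12,  6, 25, 11, 18],              # years since onset
    "gaa":      [450, 700, 900, 600, 950, 800, 500, 1000, 650, 850],   # shorter-allele repeats
})

model = smf.ols("icars ~ duration + gaa", data=df).fit()
print(model.summary())   # coefficients show how each variable relates to ICARS
```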

Relevance: 20.00%

Abstract:

This paper analyzes the possibilities of integrating cost information and engineering design. Special emphasis is placed on the potential of using the activity-based costing (ABC) method when formulating cost information for the needs of design engineers. The paper suggests that ABC is more useful than traditional job-order costing, but its drawback is that ABC models easily become too complicated, i.e. expensive to build and maintain, and difficult to use. For engineering design, the most suitable elements of ABC are recognizing the activities of the company, constructing activity chains, identifying resources, activity and cost drivers, as well as calculating accurate product costs. ABC systems including numerous cost drivers can become complex; therefore, a comprehensive ABC-based cost information system for the use of design engineers should be considered critically. Combining the suitable ideas of ABC with engineering-oriented thinking could give competent results.
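
To make the ABC elements listed above concrete (activities, resources, cost drivers and product cost calculation), here is a minimal illustrative sketch; the activities, driver rates and products are assumptions, not taken from the paper:

```python
# Minimal activity-based costing sketch: activity cost pools are divided by
# their cost-driver volumes to get driver rates, and a product's cost is the
# sum of its driver consumption times the driver rate. All figures illustrative.

activity_cost = {"machining": 50_000.0, "assembly": 30_000.0, "design_changes": 20_000.0}
driver_volume = {"machining": 2_500, "assembly": 1_500, "design_changes": 200}  # hours, hours, change orders

driver_rate = {a: activity_cost[a] / driver_volume[a] for a in activity_cost}

# Driver consumption per product unit (hypothetical products)
products = {
    "standard_pump": {"machining": 3.0, "assembly": 1.5, "design_changes": 0.01},
    "custom_pump":   {"machining": 4.0, "assembly": 2.5, "design_changes": 0.20},
}

for name, usage in products.items():
    cost = sum(qty * driver_rate[act] for act, qty in usage.items())
    print(f"{name}: {cost:.2f} per unit")
```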

Relevance: 20.00%

Abstract:

Values and value processes are said to be needed in every organization nowadays, as the world is changing and companies have to have something to "keep it together". Organizational values, which are approved and used by the personnel, could be the key. Every organization has values, but what is the real value of values? The greatest and most crucial challenge is the feasibility of the value process. The main point of this thesis is to study how organizational members at different hierarchical levels perceive values and value processes in their organizations. This includes themes such as how values are disseminated, the targets of value processing, factors that affect the process, problems that occur during value implementation and improvements that could be made when organizational values are implemented. These subjects are studied from the perspective of organizational members (both managers and employees), i.e. individuals in the organizations. The aim is to get an insider perspective on value processing from multiple hierarchical levels. In this research I study three different organizations (forest industry, bank and retail cooperative) and their value processes. The data were gathered by interviewing personnel both in the head office and at the local level. The individuals are seen as members of organizations, and the cultural aspect is topical throughout the whole study. Values and cultures are seen as the 'actuality of reality' of organizations, interpreted by organizational members. The three case companies were chosen because they represent different lines of business and they all implemented value processing differently. Since the emphasis in this study is on the local level, the similar size of the local units was also an important factor. Values are in 'fashion', but what does the fashion tell us about real corporate practices? In annual reports companies emphasize the importance and power of official values, but what is the real 'point' of values? Values are publicly respected and advertised, yet it seems that the words do not meet the deeds. There is a clear conflict between theoretical, official and substantive organizational values: in the value processing from words to real action. This contradiction in value processing is studied here through individual perceptions. I study the kinds of perceptions organizational members have when values are processed from the head office to the local level: the official value process is studied from the individual's perspective. Value management has been studied more since the 1990s. The emphasis has usually been on managers: how they consider values in organizations and what effects this has on management. Recent literature has emphasized values as tools for improving company performance. Value implementation as a process has been studied through 'good' and 'bad' examples, as if one successful value process could be copied to all organizations. Each company is different, with different cultures and personnel, so no all-powerful way of processing values exists. In this study, the perceptions of organizational members at different hierarchical levels are emphasized. Managers are also interviewed, since managerial roles in value dissemination are crucial: organizational values cannot be well disseminated without management, as several earlier studies have shown (e.g. Kunda 1992, Martin 1992, Parker 2000).
Recent literature has not sufficiently emphasized the individual's (organizational member's) role in value processing. Organizations consist of different individuals with personal values, at all hierarchical levels. The aim in this study is to let the individual take the floor. Very often the value process is described starting from value definition and ending at dissemination, and the real results are left without attention; I wish to contribute to this area. Values are published officially in annual reports and the like as a 'goal', just like profits, yet the results and implementation of value processing are rarely followed up, at least in official reports. This is a very interesting point: why do companies espouse values if there is no real control or feedback after the processing? In this study, the personnel of three different companies are asked to give an answer. The empirical findings include several results that bring new aspects to the research area of organizational values. The targets of value processing, factors affecting value processing, management's roles and the problems of value implementation are presented through the individual's perspective. The individual's perceptions in value processing are a recurring theme throughout the whole study. A comparison between the three companies, with their diverse value processes, makes the research complete.

Relevance: 20.00%

Abstract:

Software engineering is criticized as not being engineering, or even a well-developed science at all. Software engineers seem not to know exactly how long their projects will last, what they will cost, and whether the software will work properly after release. Measurements have to be taken in software projects to improve this situation. It is of limited use to only collect metrics afterwards; the values of the relevant metrics have to be predicted, too. The predictions (i.e. estimates) form the basis for proper project management. One of the most painful problems in software projects is effort estimation. It has a clear and central effect on other project attributes like cost and schedule, and on product attributes like size and quality. Effort estimation can be used for several purposes; in this thesis only effort estimation in software projects for project management purposes is discussed. There is a short introduction to measurement issues, and some metrics relevant in the estimation context are presented. Effort estimation methods are covered quite broadly. The main new contribution of this thesis is the new estimation model that has been created. It makes use of the basic concepts of Function Point Analysis but avoids the problems and pitfalls found in that method. It is relatively easy to use and learn. Effort estimation accuracy has improved significantly since this model was taken into use. A major innovation related to the new estimation model is the identified need for hierarchical software size measurement, for which the author of this thesis has developed a three-level solution. All currently used size metrics are static in nature, whereas the proposed metric is dynamic: it makes use of the increased understanding of the nature of the work as specification and design work proceed, and thus 'grows up' along with the software project. Developing an effort estimation model is not possible without gathering and analyzing history data. However, there are many problems with data in software engineering; a major roadblock is the amount and quality of the data available. This thesis shows some useful techniques that have been successful in gathering and analyzing the data needed. An estimation process is needed to ensure that methods are used in a proper way, that estimates are stored, reported and analyzed properly, and that they are used for project management activities. A higher-level mechanism called a measurement framework is also introduced briefly. The purpose of the framework is to define and maintain a measurement or estimation process; without a proper framework, the estimation capability of an organization declines, and it requires effort even to maintain an achieved level of estimation accuracy. Estimation results over several successive releases are analyzed, and it is clearly seen that the new estimation model works and that the estimation improvement actions have been successful. The calibration of the hierarchical model is a critical activity; an example is shown to shed more light on the calibration and the model itself. There are also remarks about the sensitivity of the model. Finally, an example of usage is shown.
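
Because the thesis model itself is not reproduced in this abstract, the sketch below only illustrates the general idea of size-based effort estimation and of tracking estimation accuracy against history data with the mean magnitude of relative error (MMRE); the size metric, productivity rate and project figures are assumptions, not the model from the thesis:

```python
# Generic size-based effort estimation sketch (NOT the thesis model):
# effort is estimated from a functional size measure via an assumed
# productivity rate, and estimation accuracy is tracked with MMRE.

def estimate_effort(size_units: float, hours_per_unit: float = 8.0) -> float:
    """Estimated effort in person-hours from a functional size measure."""
    return size_units * hours_per_unit

# History data: (estimated size, actual effort in person-hours) - illustrative only
history = [(120, 1000), (80, 610), (200, 1750), (150, 1150)]

def mmre(history, hours_per_unit=8.0):
    """Mean magnitude of relative error of the estimates against actuals."""
    errors = [abs(estimate_effort(size, hours_per_unit) - actual) / actual
              for size, actual in history]
    return sum(errors) / len(errors)

print(f"MMRE = {mmre(history):.1%}")  # lower is better; used to recalibrate hours_per_unit
```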

Relevance: 20.00%

Abstract:

In commercial orange groves in Corrientes, Argentina, Zn deficiency symptoms and small fruit were observed. Over four years (1995 to 1998), six treatments were tested on Valencia orange (Citrus sinensis Osb.) on Rough lemon (C. jambhiri Lush.) rootstock, planted in 1974 in a sandy soil. Treatments varied from 1 to 3 kg KCl tree-1 year-1 (applied in April and December), with and without Zineb 80 at 0.35%, 20 L tree-1 year-1 (13.3 g Zn tree-1, applied in December). The experimental design was a randomized complete block with four replications, with a single tree plus border trees per experimental plot. Foliar samples were taken every year in autumn and summer, and foliar concentrations of Zn and K were determined by atomic absorption spectrometry. Harvested fruit was classified into small, medium and large. Analysis of variance, Tukey tests and Pearson correlations between production and foliar concentrations were performed. Higher fertilization levels of K combined with Zn increased medium and large fruit production (in kg and as a percentage). Foliar concentrations of K and Zn were positively correlated with large and medium fruit production and negatively correlated with small fruit production. Chemical name used: zinc ethylenebis(dithiocarbamate) (Zineb).
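
As an illustration of the statistical treatment described (Pearson correlations between foliar concentrations and production by fruit size class), here is a minimal sketch with invented plot-level values; only the procedure, not the data, reflects the study:

```python
# Illustrative sketch of the correlation analysis described in the abstract:
# Pearson correlation between foliar K concentration and production of large
# versus small fruit. The plot-level values below are invented for demonstration.
from scipy.stats import pearsonr

foliar_k = [0.9, 1.1, 1.3, 1.5, 1.7, 1.9]   # % dry weight, per plot
large_kg = [40, 48, 55, 63, 70, 78]          # kg of large fruit per tree
small_kg = [30, 26, 22, 18, 15, 12]          # kg of small fruit per tree

r_large, p_large = pearsonr(foliar_k, large_kg)
r_small, p_small = pearsonr(foliar_k, small_kg)
print(f"K vs large fruit: r = {r_large:.2f} (p = {p_large:.3f})")
print(f"K vs small fruit: r = {r_small:.2f} (p = {p_small:.3f})")
```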

Relevance: 20.00%

Abstract:

The optimization of most pesticide and fertilizer applications is based on overall grove conditions. In this work we propose a measurement system based on a ground laser scanner to estimate the volume of the trees and then extrapolate their foliage surface in real time. Recently, Wei [9, 10] used a terrestrial LIDAR to measure tree height, width and volume, developing a set of experiments to evaluate the repeatability and accuracy of the measurements and obtaining a coefficient of variation of 5.4% and a relative error of 4.4% in the estimation of the volume, but without real-time capabilities. Tests with pear trees demonstrated that the relation between the volume and the foliage can be interpreted as linear, with a coefficient of correlation (R) of 0.81, and that the foliar surface can be estimated with an average error of less than 5%.
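
The reported linear relation between scanner-estimated canopy volume and foliar surface can be illustrated with a simple least-squares fit; the sample values below are invented, and only the fitting and error computation are shown:

```python
# Sketch of fitting the linear volume-to-foliar-surface relation reported in
# the abstract (R = 0.81, average error < 5%). The measurements below are
# invented; only the procedure is illustrative.
import numpy as np

volume_m3  = np.array([2.1, 2.8, 3.5, 4.0, 4.6, 5.2])        # estimated canopy volume
foliage_m2 = np.array([14.0, 18.5, 23.1, 26.0, 30.2, 33.9])  # measured foliar surface

slope, intercept = np.polyfit(volume_m3, foliage_m2, 1)
predicted = slope * volume_m3 + intercept

r = np.corrcoef(volume_m3, foliage_m2)[0, 1]
avg_rel_err = np.mean(np.abs(predicted - foliage_m2) / foliage_m2)
print(f"surface ≈ {slope:.2f} * volume + {intercept:.2f}")
print(f"R = {r:.2f}, average relative error = {avg_rel_err:.1%}")
```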

Relevance: 20.00%

Abstract:

Recently, edge-matching puzzles, an NP-complete problem, have received considerable attention from wide audiences thanks to money-prize contests. We consider these competitions not only a challenge for SAT/CSP solving techniques but also an opportunity to showcase the advances of the SAT/CSP community to a general audience. This paper studies the NP-complete problem of edge-matching puzzles, focusing on providing generation models for problem instances of variable hardness and on their resolution through the application of SAT and CSP techniques. On the generation side, we also identify the phase transition phenomena for each model. As solving methods, we employ both SAT solvers, through translation to a SAT formula, and two ad hoc CSP solvers we have developed, with different levels of consistency, employing several generic and specialized heuristics. Finally, we conducted an extensive experimental investigation to identify the hardest generation models and the best-performing solving techniques.
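
To give a flavour of the underlying problem (this is not one of the paper's SAT translations or CSP solvers), here is a minimal backtracking solver for a tiny, assumed 2x2 edge-matching instance, in which border edges must carry colour 0 and adjacent edges must match:

```python
# Minimal backtracking solver for a tiny edge-matching puzzle (illustrative only).
# A piece is a tuple of edge colours (top, right, bottom, left); colour 0 marks a
# border edge. Pieces may be rotated; every internal edge must match its neighbour
# and every outer edge must carry colour 0.
from itertools import permutations

N = 2  # 2x2 board

# Four pieces forming a solvable 2x2 instance (assumed example).
PIECES = [
    (0, 1, 2, 0),  # fits top-left
    (0, 0, 3, 1),  # fits top-right
    (2, 4, 0, 0),  # fits bottom-left
    (3, 0, 0, 4),  # fits bottom-right
]

def rotations(piece):
    """All four clockwise rotations of a piece."""
    t, r, b, l = piece
    return [(t, r, b, l), (l, t, r, b), (b, l, t, r), (r, b, l, t)]

def fits(board, row, col, piece):
    """Check border colours and matches with the already-placed neighbours."""
    t, r, b, l = piece
    if row == 0 and t != 0: return False
    if row == N - 1 and b != 0: return False
    if col == 0 and l != 0: return False
    if col == N - 1 and r != 0: return False
    if row > 0 and board[row - 1][col][2] != t: return False  # piece above
    if col > 0 and board[row][col - 1][1] != l: return False  # piece to the left
    return True

def place(order, pos, board):
    """Place order[pos:] onto the board by backtracking; return a solved copy or None."""
    if pos == N * N:
        return [row[:] for row in board]
    row, col = divmod(pos, N)
    for rot in rotations(order[pos]):
        if fits(board, row, col, rot):
            board[row][col] = rot
            solution = place(order, pos + 1, board)
            if solution:
                return solution
            board[row][col] = None
    return None

def solve():
    """Try every assignment of pieces to cells, allowing rotations."""
    board = [[None] * N for _ in range(N)]
    for order in permutations(PIECES):
        solution = place(order, 0, board)
        if solution:
            return solution
    return None

print(solve())
```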

Relevance: 20.00%

Abstract:

IMPORTANCE: Cerebral amyloid-β aggregation is an early pathological event in Alzheimer disease (AD), starting decades before dementia onset. Estimates of the prevalence of amyloid pathology in persons without dementia are needed to understand the development of AD and to design prevention studies. OBJECTIVE: To use individual participant data meta-analysis to estimate the prevalence of amyloid pathology as measured with biomarkers in participants with normal cognition, subjective cognitive impairment (SCI), or mild cognitive impairment (MCI). DATA SOURCES: Relevant biomarker studies identified by searching studies published before April 2015 using the MEDLINE and Web of Science databases and through personal communication with investigators. STUDY SELECTION: Studies were included if they provided individual participant data for participants without dementia and used an a priori defined cutoff for amyloid positivity. DATA EXTRACTION AND SYNTHESIS: Individual records were provided for 2914 participants with normal cognition, 697 with SCI, and 3972 with MCI aged 18 to 100 years from 55 studies. MAIN OUTCOMES AND MEASURES: Prevalence of amyloid pathology on positron emission tomography or in cerebrospinal fluid according to AD risk factors (age, apolipoprotein E [APOE] genotype, sex, and education) estimated by generalized estimating equations. RESULTS: The prevalence of amyloid pathology increased from age 50 to 90 years from 10% (95% CI, 8%-13%) to 44% (95% CI, 37%-51%) among participants with normal cognition; from 12% (95% CI, 8%-18%) to 43% (95% CI, 32%-55%) among patients with SCI; and from 27% (95% CI, 23%-32%) to 71% (95% CI, 66%-76%) among patients with MCI. APOE-ε4 carriers had 2 to 3 times higher prevalence estimates than noncarriers. The age at which 15% of the participants with normal cognition were amyloid positive was approximately 40 years for APOE ε4ε4 carriers, 50 years for ε2ε4 carriers, 55 years for ε3ε4 carriers, 65 years for ε3ε3 carriers, and 95 years for ε2ε3 carriers. Amyloid positivity was more common in highly educated participants but not associated with sex or biomarker modality. CONCLUSIONS AND RELEVANCE: Among persons without dementia, the prevalence of cerebral amyloid pathology as determined by positron emission tomography or cerebrospinal fluid findings was associated with age, APOE genotype, and presence of cognitive impairment. These findings suggest a 20- to 30-year interval between first development of amyloid positivity and onset of dementia.
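
The prevalence modelling described (amyloid positivity as a function of age and APOE genotype, estimated with generalized estimating equations to account for clustering by study) can be sketched as below; the records are synthetic and purely illustrative, not the meta-analysis data:

```python
# Illustrative sketch of the GEE analysis described: amyloid positivity (0/1)
# modelled on age and APOE-e4 carriership, with study as the cluster variable.
# The records below are synthetic; they are NOT the meta-analysis data.
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 400
df = pd.DataFrame({
    "age":   rng.uniform(50, 90, n),
    "apoe4": rng.integers(0, 2, n),
    "study": rng.integers(0, 10, n),
})
# Synthetic outcome: positivity rises with age and with APOE-e4 carriership
logit = -7 + 0.08 * df["age"] + 1.0 * df["apoe4"]
df["amyloid_pos"] = rng.binomial(1, 1 / (1 + np.exp(-logit)))

model = smf.gee("amyloid_pos ~ age + apoe4", groups="study",
                data=df, family=sm.families.Binomial()).fit()
print(model.summary())
```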

Relevance: 20.00%

Abstract:

OBJECTIVES: The objectives of this study are to present the technique and results of endoscopic repair of laryngotracheoesophageal clefts (LTEC) extending caudally through the cricoid plate and into the cervical trachea, and to revisit the classification of LTEC. METHODS: The authors conducted a retrospective case analysis of four infants with complete laryngeal clefts (extending through the cricoid plate in three cases and down into the cervical trachea in one case) treated endoscopically by CO2 laser incision of the mucosa and two-layer endoscopic closure of the cleft, without postoperative intubation or tracheotomy. RESULTS: All four infants resumed spontaneous respiration without support after a mean postoperative period of 3 days on continuous positive airway pressure (CPAP). They accepted oral feeding within 5 postoperative days (range, 3-11 days). No breakdown of the endoscopic repair was encountered. After a mean follow-up of 48 months (range, 3 months to 7 years), all children have a good voice and show no sign of residual aspiration, but experience slight exertional dyspnea. CONCLUSION: This limited experience with the endoscopic repair of extrathoracic LTEC shows that a minimally invasive approach sparing the need for postoperative intubation or tracheotomy is feasible and safe if modern technology (ultrapulse CO2 laser, endoscopic suturing, and postoperative use of CPAP in the intensive care unit) is available.

Relevance: 20.00%

Abstract:

Forest management for groundwater protection is a cheap solution to a vital problem and has been implemented for decades all over the world. The main challenge is to ensure constant, adequate forest management so that the service provided is preserved. On Lombok Island, the problem is the lack of enforcement of public regulation in the forest area; payments for environmental services (PES) are therefore used as an alternative in this weak institutional environment. The results of the field research show that, surprisingly, the "famous" Lombok PES case is not a PES at all, even though some payments are made. The research nevertheless has a happy ending, because other "forest for water" PES schemes were identified in the field. In addition, the legal review identified a way to resolve the lack of a legal basis for PES implementation. Thus, the PES examples we identified could be spread all over Indonesia without conflicting with other regulations (fiscal, local finance, forest, etc.) and without circumventing the forest administrations.