76 results for Applying technology
Abstract:
Life cycle analysis (LCA) approaches require adaptation to reflect the increasing delocalization of production to emerging countries. This work addresses this challenge by establishing a country-level, spatially explicit life cycle inventory (LCI). This study comprises three separate dimensions. The first dimension is spatial: processes and emissions are allocated to the country in which they take place and modeled to take into account local factors. Production is located in the emerging economies China and India, while consumption occurs in Germany, an Organisation for Economic Co-operation and Development (OECD) country. The second dimension is the product level: we consider two distinct textile garments, a cotton T-shirt and a polyester jacket, in order to highlight potential differences in the production and use phases. The third dimension is the inventory composition: we track CO2, SO2, NOx, and particulates, four major atmospheric pollutants, as well as energy use. This third dimension enriches the analysis of the spatial differentiation (first dimension) and distinct products (second dimension). We describe the textile production and use processes and define a functional unit for a garment. We then model important processes using a hierarchy of preferential data sources. We place special emphasis on the modeling of the principal local energy processes: electricity and transport in emerging countries. The spatially explicit inventory is disaggregated by the country of location of the emissions and analyzed according to the dimensions of the study: location, product, and pollutant. The inventory shows striking differences between the two products considered as well as between the different pollutants considered. For the T-shirt, over 70% of the energy use and CO2 emissions occur in the consuming country, whereas for the jacket, more than 70% occur in the producing country. This reversal of proportions is due to differences in the use phase of the garments.
For SO2, in contrast, over two thirds of the emissions occur in the country of production for both the T-shirt and the jacket. The difference in emission patterns between CO2 and SO2 is due to local electricity processes, justifying our emphasis on local energy infrastructure. The complexity of considering differences in location, product, and pollutant is rewarded by a much richer understanding of a global production-consumption chain. The inclusion of two different products in the LCI highlights the importance of the definition of a product's functional unit in the analysis and implications of results. Several use-phase scenarios demonstrate the importance of consumer behavior over equipment efficiency. The spatial emission patterns of the different pollutants allow us to understand the role of various energy infrastructure elements. The emission patterns furthermore inform the debate on the Environmental Kuznets Curve, which applies only to pollutants that can be easily filtered and does not take into account the effects of production displacement. We also discuss the appropriateness and limitations of applying the LCA methodology in a global context, especially in developing countries. Our spatial LCI method yields important insights into the quantity and pattern of emissions due to different product life cycle stages, dependent on the local technology, and emphasizes the importance of consumer behavior. From a life cycle perspective, consumer education promoting air-drying and cool washing is more important than efficient appliances. Spatial LCI with country-specific data is a promising method, necessary for meeting the challenges of globalized production-consumption chains. We recommend inventory reporting of final energy forms, such as electricity, and modular LCA databases, which would allow easy modification of the underlying energy infrastructure.
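The country-level disaggregation described above can be sketched as a minimal data structure: each inventory entry records a life-cycle stage, the country where it occurs, a pollutant, and an amount, and totals are then summed per country. All entries and figures below are illustrative placeholders, not data from the study.

```python
from collections import defaultdict

# Hypothetical spatially explicit inventory entries:
# (life-cycle stage, country of occurrence, pollutant, amount in illustrative units)
inventory = [
    ("fibre production", "China",   "CO2", 1.2),
    ("fibre production", "China",   "SO2", 0.9),
    ("garment assembly", "India",   "CO2", 0.4),
    ("use phase",        "Germany", "CO2", 3.1),
    ("use phase",        "Germany", "SO2", 0.3),
]

def by_country(entries, pollutant):
    """Aggregate one pollutant's emissions per country of occurrence."""
    totals = defaultdict(float)
    for stage, country, pol, amount in entries:
        if pol == pollutant:
            totals[country] += amount
    return dict(totals)

def share(totals, country):
    """Fraction of the pollutant total located in one country."""
    return totals.get(country, 0.0) / sum(totals.values())

co2 = by_country(inventory, "CO2")
print(co2)
print(round(share(co2, "Germany"), 2))  # consuming-country share of CO2
```

With real data, the same aggregation run once per pollutant reproduces the kind of contrast reported above (CO2 dominated by the consuming country for the T-shirt, SO2 by the producing countries).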
Abstract:
Direct MR arthrography has better diagnostic accuracy than MR imaging alone. However, contrast material is not always homogeneously distributed in the articular space. Lesions of cartilage surfaces or intra-articular soft tissues can thus be misdiagnosed. Concomitant application of axial traction during MR arthrography leads to articular distraction. This enables better distribution of contrast material in the joint and better delineation of intra-articular structures. Therefore, this technique improves the detection of cartilage lesions. Moreover, the axial stress applied on articular structures may reveal lesions invisible on MR images without traction. Based on our clinical experience, we believe that this relatively unknown technique is promising and should be further developed.
Abstract:
The flourishing number of publications on the use of isotope ratio mass spectrometry (IRMS) in forensic science denotes the enthusiasm and the attraction generated by this technology. IRMS has demonstrated its potential to distinguish chemically identical compounds coming from different sources. Despite the numerous applications of IRMS to a wide range of forensic materials, its implementation in a forensic framework is less straightforward than it appears. In addition, each laboratory has developed its own strategy of analysis on calibration, sequence design, standards utilisation and data treatment, without a clear consensus. Through the experience acquired from research undertaken in different forensic fields, we propose a methodological framework of the whole process using IRMS methods. We emphasize the importance of considering isotopic results as part of a whole approach when applying this technology to a particular forensic issue. The process is divided into six different steps, which should be considered for a thoughtful and relevant application. The dissection of this process into fundamental steps, further detailed, enables a better understanding of the essential, though not exhaustive, factors that have to be considered in order to obtain results of quality, sufficiently robust to proceed to retrospective analyses or interlaboratory comparisons.
Abstract:
Antibody display technologies (ADT) such as phage display (PD) have substantially improved the production of monoclonal antibodies (mAbs) and Ab fragments by bypassing several limitations associated with the traditional approach of hybridoma technology. In the current study, we capitalized on PD technology to produce a high-affinity single chain variable fragment (scFv) against tumor necrosis factor-alpha (TNF-α), which is a potent pro-inflammatory cytokine and plays an important role in various inflammatory diseases and malignancies. To pursue production of scFv antibody fragments against human TNF-α, we performed five rounds of biopanning using stepwise decreased amounts of TNF-α (1 to 0.1 μg), a semi-synthetic phage antibody library (Tomlinson I + J) and TG1 cells. Antibody clones were isolated and selected through enzyme-linked immunosorbent assay (ELISA) screening. The selected scFv antibody fragments were further characterized by means of ELISA, PCR, restriction fragment length polymorphism (RFLP) and Western blot analyses as well as fluorescence microscopy and flow cytometry. Based upon binding affinity to TNF-α, 15 clones were selected out of 50 positive clones enriched from PD in vitro selection. The selected scFvs displayed high specificity and binding affinity, with Kd values in the nM range, to human TNF-α. The immunofluorescence analysis revealed significant binding of the selected scFv antibody fragments to Raji B lymphoblasts. The effectiveness of the selected scFv fragments was further validated by flow cytometry analysis in lipopolysaccharide (LPS)-treated mouse fibroblast L929 cells. Based upon these findings, we propose the selected fully human anti-TNF-α scFv antibody fragments as potential immunotherapy agents that may be translated into preclinical/clinical applications.
Abstract:
The goal of this dissertation is to find and provide the basis for a managerial tool that allows a firm to easily express its business logic. The methodological basis for this work is design science, where the researcher builds an artifact to solve a specific problem. In this case the aim is to provide an ontology that makes it possible to make a firm's business model explicit. In other words, the proposed artifact helps a firm to formally describe its value proposition, its customers, the relationship with them, the necessary intra- and inter-firm infrastructure and its profit model. Such an ontology is relevant because until now there has been no model that expresses a company's global business logic from a pure business point of view. Previous models essentially take an organizational or process perspective or cover only parts of a firm's business logic. The four main pillars of the ontology, which are inspired by management science and enterprise and process modeling, are product, customer interface, infrastructure and finance. The ontology is validated by case studies and a panel of experts and managers. The dissertation also provides a software prototype to capture a company's business model in an information system. The last part of the thesis consists of a demonstration of the value of the ontology in business strategy and Information Systems (IS) alignment. Structure of this thesis: The dissertation is structured in nine parts: Chapter 1 presents the motivations of this research, the research methodology with which the goals shall be achieved, and why this dissertation presents a contribution to research. Chapter 2 investigates the origins, the term and the concept of business models. It defines what is meant by business models in this dissertation and how they are situated in the context of the firm. In addition, this chapter outlines the possible uses of the business model concept.
Chapter 3 gives an overview of the research done in the field of business models and enterprise ontologies. Chapter 4 introduces the major contribution of this dissertation: the business model ontology. In this part of the thesis the elements, attributes and relationships of the ontology are explained and described in detail. Chapter 5 presents a case study of the Montreux Jazz Festival, whose business model was captured by applying the structure and concepts of the ontology. In fact, it gives an impression of what a business model description based on the ontology looks like. Chapter 6 shows an instantiation of the ontology into a prototype tool: the Business Model Modelling Language (BM2L). This is an XML-based description language that makes it possible to capture and describe the business model of a firm, and it has large potential for further applications. Chapter 7 is about the evaluation of the business model ontology. The evaluation builds on a literature review, a set of interviews with practitioners and case studies. Chapter 8 gives an outlook on possible future research and applications of the business model ontology. The main areas of interest are the alignment of business and information technology (IT)/information systems (IS) and business model comparison. Finally, chapter 9 presents some conclusions.
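To illustrate the idea of an XML-based capture of the four pillars, the following minimal sketch serializes a business model description along product, customer interface, infrastructure and finance. The element names and content are illustrative assumptions, not the actual BM2L schema.

```python
import xml.etree.ElementTree as ET

# Hypothetical BM2L-style document built around the ontology's four pillars.
# Tag names and values are placeholders for illustration only.
model = ET.Element("businessModel", firm="ExampleCo")
for pillar, detail in [
    ("product", "value proposition"),
    ("customerInterface", "target customers, channels and relationships"),
    ("infrastructure", "intra- and inter-firm capabilities and partnerships"),
    ("finance", "cost structure and revenue model"),
]:
    ET.SubElement(model, pillar).text = detail

print(ET.tostring(model, encoding="unicode"))
```

The point of such a machine-readable form, as the thesis argues for BM2L, is that a captured model can then be stored, compared across firms, or aligned with IS descriptions.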
Abstract:
The global human population is expected to reach ∼9 billion by 2050. Feeding this many people represents a major challenge requiring global crop yield increases of up to 100%. Microbial symbionts of plants such as arbuscular mycorrhizal fungi (AMF) represent a huge but unrealized resource for improving yields of globally important crops, especially in the tropics. We argue that the application of AMF in agriculture is too simplistic and ignores basic ecological principles. To meet this challenge, a community and population ecology approach can contribute greatly. First, ecologists could significantly improve our understanding of the determinants of the survival of introduced AMF, the role of adaptability and intraspecific diversity of AMF, and whether inoculation has a direct or indirect effect on plant production. Second, we call for extensive metagenomics as well as population genomics studies, which are crucial to assess the environmental impact that the introduction of non-local AMF may have on native AMF communities and populations. Finally, we plead for an ecologically sound use of AMF in efforts to increase food security at a global scale in a sustainable manner.
Abstract:
In the Arabidopsis thaliana genome, over 1000 putative genes encoding small, presumably secreted, signalling peptides can be recognized. However, a major obstacle in identifying the function of genes encoding small signalling peptides is the limited number of available loss-of-function mutants. To overcome this, a promising new tool, antagonistic peptide technology, was recently developed. Here, this antagonistic peptide technology was tested on selected CLE peptides and the related IDA peptide, and its usefulness in the context of studies of peptide function is discussed. Based on the analyses, it was concluded that the antagonistic peptide approach is not the ultimate means to overcome redundancy or the lack of loss-of-function lines. However, information collected using antagonistic peptide approaches (in the broad sense) can be very useful; these approaches do not work in all cases, though, and require deep insight into the interaction between the ligand and its receptor to be successful. This, as well as peptide ligand structure considerations, should be taken into account before ordering a wide range of synthetic peptide variants and/or generating transgenic plants.
Abstract:
INTRODUCTION: The decline of malaria and the scale-up of rapid diagnostic tests call for a revision of IMCI. A new algorithm (ALMANACH) running on mobile technology was developed based on the latest evidence. The objective was to ensure that ALMANACH was safe while keeping a low rate of antibiotic prescription. METHODS: Consecutive children aged 2-59 months with acute illness were managed using ALMANACH (2 intervention facilities) or standard practice (2 control facilities) in Tanzania. Primary outcomes were the proportion of children cured at day 7 and the proportion who received antibiotics on day 0. RESULTS: 130/842 (15.4%) in the ALMANACH arm and 241/623 (38.7%) in the control arm were diagnosed with an infection in need of antibiotics, while 3.8% and 9.6%, respectively, had malaria. 815/838 (97.3%; 96.1-98.4%) were cured at D7 using ALMANACH versus 573/623 (92.0%; 89.8-94.1%) using standard practice (p<0.001). Of the 23 children not cured at D7 using ALMANACH, 44% had skin problems, 30% pneumonia, 26% upper respiratory infection and 13% likely viral infection at D0. Secondary hospitalization occurred for one child using ALMANACH and for one child, who eventually died, using standard practice. At D0, antibiotics were prescribed to 15.4% (12.9-17.9%) using ALMANACH versus 84.3% (81.4-87.1%) using standard practice (p<0.001). 2.3% (1.3-3.3%) versus 3.2% (1.8-4.6%) received an antibiotic secondarily. CONCLUSION: Management of children using ALMANACH improved clinical outcomes and reduced antibiotic prescription by 80%. This was achieved through more accurate diagnoses and hence better identification of the children in need of antibiotic treatment. Building the algorithm on mobile technology allows easy access and rapid updates of the decision chart. TRIAL REGISTRATION: Pan African Clinical Trials Registry PACTR201011000262218.
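The reported "by 80%" reduction follows directly from the two day-0 prescription rates in the abstract; a quick arithmetic check:

```python
# Day-0 antibiotic prescription rates reported in the trial
almanach_rate = 0.154  # ALMANACH arm: 15.4%
standard_rate = 0.843  # standard-practice arm: 84.3%

# Relative reduction in prescription achieved by the algorithm
relative_reduction = (standard_rate - almanach_rate) / standard_rate
print(round(relative_reduction * 100))  # ~82%, consistent with "by 80%"
```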
Abstract:
The goal of the present review is to explain how immersive virtual environment technology (IVET) can be used for the study of social interactions and how the use of virtual humans in immersive virtual environments can advance research and application in many different fields. Researchers studying individual differences in social interactions are typically interested in keeping the behavior and the appearance of the interaction partner constant across participants. With IVET researchers have full control over the interaction partners and can standardize them while still keeping the simulation realistic. Virtual simulations are valid: growing evidence shows that studies conducted with IVET can indeed replicate some well-known findings of social psychology. Moreover, IVET allows researchers to subtly manipulate characteristics of the environment (e.g., visual cues to prime participants) or of the social partner (e.g., his/her race) to investigate their influences on participants' behavior and cognition. Furthermore, manipulations that would be difficult or impossible in real life (e.g., changing participants' height) can be easily obtained with IVET. Besides the advantages for theoretical research, we explore the most recent training and clinical applications of IVET, its integration with other technologies (e.g., social sensing) and future challenges for researchers (e.g., making the communication between virtual humans and participants smoother).
Abstract:
BACKGROUND: Frequent emergency department (ED) users meet several of the criteria of vulnerability, but this needs to be further examined taking into consideration all the different dimensions of vulnerability. This study aimed to characterize frequent ED users and to define risk factors of frequent ED use within a universal health care coverage system, applying a conceptual framework of vulnerability. METHODS: A controlled, cross-sectional study comparing frequent ED users to a control group of non-frequent users was conducted at the Lausanne University Hospital, Switzerland. Frequent users were defined as patients with five or more visits to the ED in the previous 12 months. The two groups were compared using validated scales for each of the five dimensions of an innovative conceptual framework: socio-demographic characteristics; somatic, mental, and risk-behavior indicators; and use of health care services. Independent t-tests, Wilcoxon rank-sum tests, Pearson's Chi-squared test and Fisher's exact test were used for the comparison. To examine the vulnerability-related risk factors for being a frequent ED user, univariate and multivariate logistic regression models were used. RESULTS: We compared 226 frequent users and 173 controls. Frequent users had more vulnerabilities in all five dimensions of the conceptual framework. They were younger and more often immigrants from low/middle-income countries or unemployed; they had more somatic and psychiatric comorbidities, were more often tobacco users, and had more primary care physician (PCP) visits. The most significant risk factors for frequent ED use were a history of more than three hospital admissions in the previous 12 months (adj OR: 23.2, 95% CI = 9.1-59.2), the absence of a PCP (adj OR: 8.4, 95% CI = 2.1-32.7), living less than 5 km from an ED (adj OR: 4.4, 95% CI = 2.1-9.0), and a household income lower than USD 2,800/month (adj OR: 4.3, 95% CI = 2.0-9.2).
CONCLUSIONS: Frequent ED users within a universal health coverage system form a highly vulnerable population, when taking into account all five dimensions of a conceptual framework of vulnerability. The predictive factors identified could be useful in the early detection of future frequent users, in order to address their specific needs and decrease vulnerability, a key priority for health care policy makers. Application of the conceptual framework in future research is warranted.
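The adjusted odds ratios above come from multivariate logistic regression, where an OR is the exponential of a fitted coefficient and its Wald 95% CI is obtained from the coefficient's standard error. A minimal sketch of that conversion, using illustrative numbers close to the "living less than 5 km from an ED" result (the standard error is an assumption, not the study's fitted value):

```python
import math

def odds_ratio(beta):
    """Convert a logistic-regression coefficient to an odds ratio."""
    return math.exp(beta)

def ci_95(beta, se):
    """95% Wald confidence interval for the odds ratio."""
    return math.exp(beta - 1.96 * se), math.exp(beta + 1.96 * se)

# Illustrative values only, not the study's fitted model:
beta = math.log(4.4)   # coefficient whose OR is 4.4
se = 0.36              # assumed standard error of the coefficient
print(round(odds_ratio(beta), 1))            # 4.4
lo, hi = ci_95(beta, se)
print(round(lo, 1), round(hi, 1))            # roughly 2.2 to 8.9
```

A CI that excludes 1.0, as all four reported intervals do, indicates the factor remains significantly associated with frequent ED use after adjustment.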