880 results for Building information modeling


Relevance: 30.00%

Abstract:

As was the case in 2010, when the National Institutes of Health issued a consensus statement on the prevention of Alzheimer's and other dementias, there remains a lack of firm evidence for dementia prevention. Because of the difficulties in studying this phenomenon, no modifiable risk factors for dementia have been definitively established, and no pharmaceutical agents or nutritional supplements have been proven to prevent Alzheimer's disease or cognitive decline. However, longitudinal observational studies have identified several factors associated with dementia. A recent review article summarizes the current epidemiological evidence about Alzheimer's and other dementias and presents three ongoing large-scale randomized controlled trials (RCTs) that focus on preventing dementia. The review argues that there is substantial evidence for many factors that, in combination, might reduce the risk of, or delay the onset of, dementia. Although no specific cure for dementia exists, and no specific pathway between risk factor and disease onset has been identified, several cardiovascular, stress, toxicity, and psychosocial variables have been repeatedly associated with dementia. Protective factors, such as high education, physical exercise, and not smoking cigarettes, have been identified as well. Intervention studies that account for these multiple factors may well identify strategies for preventing or delaying dementia. However, the protective effects and risk factors suggested by observational data have yet to be assessed in RCT research. The role of such factors in reducing or increasing the risk for dementia needs to be more specifically defined. Three ongoing RCT studies in Europe show promise in this area, as they target multiple risk and protective factors by promoting healthy lifestyle changes and medical treatment of vascular diseases. These are:
- FINGER, a Finnish trial involving 1,200 older adults at risk for dementia. This intervention features nutritional guidance, physical activity, cognitive and social engagement, and medical management of risk factors. Participants were involved in previous, intensive observational studies of vascular health and health behavior, so FINGER will provide a level of relevant information about its research subjects that is normally impossible for clinical RCTs to attain.
- MAPT, a multicenter study of 1,680 frail older adults in France. This study will compare the efficacy of omega-3 dietary supplementation with a multidomain training intervention that involves physical and cognitive training. The study will include follow-up assessments after five years.
- PreDIVA, a Dutch study of 3,534 community-dwelling participants between 70 and 78 years old, recruited from primary care clinics. This study will compare standard medical care with a multicomponent vascular health intervention. The study will last for six years and measure both dementia and disability outcomes.
These studies are an important step in dementia research, using earlier observational studies as the basis for rigorously assessed interventions. Although a cure for dementia has not been identified, this new research may identify preventive strategies against dementia. Source: Mangialasche F, Kivipelto M, et al. (2012). Dementia prevention: current epidemiological evidence and future perspective. Alzheimer's Research and Therapy 4:6.

Relevance: 30.00%

Abstract:

NanoImpactNet (NIN) is a multidisciplinary European Commission funded network on the environmental, health and safety (EHS) impact of nanomaterials. The 24 founding scientific institutes are leading European research groups active in the fields of nanosafety, nanorisk assessment and nanotoxicology. This 4-year project is the new focal point for information exchange within the research community. Contact with other stakeholders is vital and their needs are being surveyed. NIN is communicating with hundreds of stakeholders: businesses; internet platforms; industry associations; regulators; policy makers; national ministries; international agencies; standard-setting bodies and NGOs concerned with labour rights, EHS or animal welfare. To improve this communication, internet research, a questionnaire distributed via partners and targeted phone calls were used to identify stakeholders' interests and needs. Knowledge gaps and the necessity for further data mentioned by representatives of all stakeholder groups in the targeted phone calls concerned:
• the potential toxic and safety hazards of nanomaterials throughout their lifecycles;
• the fate and persistence of nanoparticles in humans, animals and the environment;
• the associated risks of nanoparticle exposure;
• greater participation in the preparation of nomenclature, standards, methodologies, protocols and benchmarks;
• the development of best practice guidelines;
• voluntary schemes on responsibility;
• databases of materials, research topics and themes, but also of expertise.
These findings suggested that stakeholders and NIN researchers share very similar knowledge needs, and that open communication and free movement of knowledge will benefit both researchers and industry. Subsequently, a workshop was organised by NIN focused on building a sustainable multi-stakeholder dialogue. Specific questions were asked of different stakeholder groups to encourage discussion and open communication.
1. What information do stakeholders need from researchers, and why? The discussions about this question confirmed the needs identified in the targeted phone calls.
2. How should information be communicated? While it was agreed that reporting should be enhanced, commercial confidentiality and economic competition were identified as major obstacles. It was recognised that expertise was needed in the areas of commercial law and economics for a well-informed treatment of this communication issue.
3. Can engineered nanomaterials be used safely? The idea that nanomaterials are probably safe because some of them have been produced 'for a long time' was questioned, since many materials in common use have proved to be unsafe. The question of safety is also about whether the public has confidence. New legislation like REACH could help with this issue. Hazards do not materialise if exposure can be avoided or at least significantly reduced. Thus, there is a need for information on what can be regarded as acceptable levels of exposure. Finally, it was noted that there is no such thing as a perfectly safe material, only boundaries, and at this moment we do not know where these boundaries lie. The matter of labelling of products containing nanomaterials was raised, as in the public mind safety and labelling are connected. This may need to be addressed, since the issue of nanomaterials in food, drink and food packaging may be the first safety issue to attract public and media attention, and this may have an impact on nanotechnology as a whole.
4. Do we need more or other regulation? Any decision-making process should accommodate the changing level of uncertainty. To address the uncertainties, adaptations of frameworks such as REACH may be indicated for nanomaterials. Regulation is often needed even if voluntary measures are welcome, because it mitigates the effects of competition between industries; data cannot be collected on a voluntary basis, for example. NIN will continue with an active stakeholder dialogue to further build interdisciplinary relationships towards a healthy future with nanotechnology.

Relevance: 30.00%

Abstract:

Excessive exposure to solar UV light is the main cause of skin cancers in humans. UV exposure depends on environmental as well as individual factors related to activity. Although outdoor occupational activities contribute significantly to the individual dose received, data on effective exposure are scarce and limited to a few occupations. A study was undertaken in order to assess effective short-term exposure among building workers and to characterize the influence of individual and local factors on exposure. The effective exposure of construction workers in a mountainous area in the southern part of Switzerland was investigated through short-term dosimetry (97 dosimeters). Three altitudes, of about 500, 1500 and 2500 m, were considered. Individual measurements over 20 working periods were performed using Spore film dosimeters at five body locations. The postural activity of the workers was concomitantly recorded and static UV measurements were also performed. Effective exposure among building workers was high and exceeded occupational recommendations for all individuals at one or more body locations. The mean daily UV dose was 11.9 SED (range 0.0-31.3 SED) on the plain, 21.4 SED (6.6-46.8 SED) at middle-mountain altitude and 28.6 SED (0.0-91.1 SED) at high-mountain altitude. Measured doses between workers and anatomical locations exhibited high variability, stressing the role of local exposure conditions and individual factors. Short-term effective exposure ranged between 0 and 200% of ambient irradiation, indicating the occurrence of intense, subacute exposures. A predictive irradiation model was developed to investigate the role of individual factors. Posture and orientation were found to account for at least 38% of the total variance of relative individual exposure, and to account for more of the total variance of effective daily exposures than altitude. Targeted sensitization actions through professional information channels and specific prevention messages are recommended. Outdoor workers at altitude should also benefit from preventive medical examinations.
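
A minimal sketch of how per-altitude dose summaries like those above can be derived from individual dosimeter readings; the readings and band labels below are hypothetical placeholders, not the study's data:

    from collections import defaultdict

    # Hypothetical dosimeter readings in SED, tagged by altitude band.
    readings = [
        ("plain", 11.2), ("plain", 14.0), ("plain", 0.0),
        ("middle", 21.0), ("middle", 25.3), ("middle", 6.6),
        ("high", 28.0), ("high", 57.7), ("high", 0.0),
    ]

    by_band = defaultdict(list)
    for band, dose in readings:
        by_band[band].append(dose)

    for band, doses in by_band.items():
        mean = sum(doses) / len(doses)
        print(f"{band}: mean {mean:.1f} SED (range {min(doses):.1f}-{max(doses):.1f} SED)")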

Relevance: 30.00%

Abstract:

This research work deals with the problem of modeling and designing a low-level speed controller for the mobile robot PRIM. The main objective is to develop an effective educational tool. On one hand, the interest in using the open mobile platform PRIM lies in integrating several subjects highly related to automatic control theory in an educational context, embracing communications, signal processing, sensor fusion and hardware design, amongst others. On the other hand, the idea is to implement useful navigation strategies such that the robot can serve as a mobile multimedia information point. It is in this context, when navigation strategies are oriented to goal achievement, that a local model predictive control approach is adopted. Hence, such studies are presented as a very interesting control strategy for developing the future capabilities of the system.
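
A minimal sketch of a receding-horizon speed controller in the spirit described above, for a first-order discrete speed model. The model parameters, horizon, cost weights and the restriction to a constant input over the horizon are illustrative simplifications, not the PRIM design; a full MPC would optimize the entire input sequence:

    import numpy as np

    # First-order discrete speed model: v[k+1] = a*v[k] + b*u[k] (assumed values).
    a, b = 0.9, 0.5
    N = 10                    # prediction horizon
    q, r = 1.0, 0.01          # tracking vs. control-effort weights
    v_ref = 0.6               # target speed (m/s)

    def mpc_step(v0, candidates=np.linspace(-1.0, 1.0, 201)):
        """Choose the constant input over the horizon that minimizes the cost."""
        best_u, best_cost = 0.0, np.inf
        for u in candidates:
            v, cost = v0, 0.0
            for _ in range(N):
                v = a * v + b * u
                cost += q * (v - v_ref) ** 2 + r * u ** 2
            if cost < best_cost:
                best_u, best_cost = u, cost
        return best_u

    # Receding horizon: apply only the first input, then re-solve at each step.
    v = 0.0
    for k in range(20):
        u = mpc_step(v)
        v = a * v + b * u
        print(f"k={k:2d}  u={u:+.2f}  v={v:.3f}")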

Relevance: 30.00%

Abstract:

Since 1986, several near-vertical seismic reflection profiles have been recorded in Switzerland in order to map the deep geologic structure of the Alps. One objective of this endeavour has been to determine the geometries of the autochthonous basement and of the external crystalline massifs, important elements for understanding the geodynamics of the Alpine orogeny. The PNR-20 seismic line W1, located in the Rawil depression of the western Swiss Alps, provides important information on this subject. It extends northward from the "Penninic front" across the Helvetic nappes to the Prealps. The crystalline massifs do not outcrop along this profile. Thus, the interpretation of "near-basement" reflections has to be constrained by down-dip projections of surface geology, "true amplitude" processing, rock physical property studies and modelling. 3-D seismic modelling has been used to evaluate the seismic response of two alternative down-dip projection models. To constrain the interpretation in the southern part of the profile, "true amplitude" processing has provided information on the strength of the reflections. Density and velocity measurements on core samples collected up-dip from the region of the seismic line have been used to evaluate reflection coefficients of typical lithologic boundaries in the region. The cover-basement contact itself is not a source of strong reflections, but strong reflections arise from within the overlying metasedimentary cover sequence, allowing the geometry of the top of the basement to be determined on the basis of "near-basement" reflections. The front of the external crystalline massifs is shown to extend beneath the Prealps, about 6 km north of the expected position. A 2-D model whose seismic response shows reflection patterns very similar to those observed is proposed.
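
The reflection coefficients mentioned above follow from acoustic impedance contrasts; a minimal sketch of the standard normal-incidence calculation (the density and velocity values are illustrative, not the study's core-sample measurements):

    # Normal-incidence reflection coefficient from acoustic impedances:
    # R = (Z2 - Z1) / (Z2 + Z1), with Z = density * P-wave velocity.
    def reflection_coefficient(rho1, v1, rho2, v2):
        z1, z2 = rho1 * v1, rho2 * v2
        return (z2 - z1) / (z2 + z1)

    # Illustrative values (kg/m^3, m/s) for a cover-basement contact with a
    # weak impedance contrast, hence a weak reflection as noted above.
    print(reflection_coefficient(2650.0, 5800.0, 2700.0, 6000.0))  # ~0.026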

Relevance: 30.00%

Abstract:

Quantitative or algorithmic trading is the automation of investment decisions according to fixed or dynamic sets of rules that determine trading orders. It has grown to account for up to 70% of the trading volume on some of the biggest financial markets, such as the New York Stock Exchange (NYSE). However, there is not a significant amount of academic literature devoted to it, due to the private nature of investment banks and hedge funds. This project aims to review the literature and discuss the available models on a subject in which publications are scarce and infrequent. We review the basic and fundamental mathematical concepts needed for modeling financial markets, such as stochastic processes, stochastic integration and basic models for price and spread dynamics, necessary for building quantitative strategies. We also contrast these models with real market data at one-minute sampling frequency from the Dow Jones Industrial Average (DJIA). Quantitative strategies try to exploit two types of behavior: trend following or mean reversion. The former is grouped under so-called technical models and the latter under so-called pairs trading. Technical models have been discarded by financial theoreticians, but we show that they can be properly cast into a well-defined scientific predictor if the signal they generate passes the test of being a Markov time. That is, we can tell whether the signal has occurred or not by examining the information up to the current time; or, more technically, if the event is F_t-measurable. On the other hand, the concept of pairs trading, or market-neutral strategy, is fairly simple. However, it can be cast in a variety of mathematical models, ranging from a method based on a simple Euclidean distance, to a co-integration framework, to stochastic differential equations such as the well-known Ornstein-Uhlenbeck mean-reverting SDE and its variations. A model for forecasting any economic or financial magnitude could be properly defined with scientific rigor yet lack any economic value and be considered useless from a practical point of view. This is why this project could not be complete without a backtesting of the mentioned strategies. Conducting a useful and realistic backtesting is by no means a trivial exercise, since the "laws" that govern financial markets are constantly evolving in time. This is why we place emphasis on the process of calibrating the strategies' parameters to the given market conditions. We find that the parameters of technical models are more volatile than their counterparts from market-neutral strategies, and calibration must be done at high sampling frequency to constantly track the current market situation. As a whole, the goal of this project is to provide an overview of a quantitative approach to investment, reviewing basic strategies and illustrating them by means of a backtesting with real financial market data. The sources of the data used in this project are Bloomberg for intraday time series and Yahoo! for daily prices. All numeric computations and graphics used and shown in this project were implemented in MATLAB from scratch as part of this thesis. No other mathematical or statistical software was used.
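
The thesis's own code is in MATLAB; as a minimal illustration of the mean-reversion idea, here is a Python sketch that simulates an Ornstein-Uhlenbeck spread and applies a causal entry rule of the Markov-time kind discussed above. All parameters and data are synthetic, not the DJIA series used in the project:

    import numpy as np

    rng = np.random.default_rng(0)

    # Simulate an Ornstein-Uhlenbeck spread: dX = theta*(mu - X) dt + sigma dW.
    theta, mu, sigma, dt, n = 2.0, 0.0, 0.3, 1.0 / 252, 2000
    x = np.zeros(n)
    for t in range(1, n):
        x[t] = (x[t - 1] + theta * (mu - x[t - 1]) * dt
                + sigma * np.sqrt(dt) * rng.standard_normal())

    # Entry/exit rule using only information up to time t (a Markov time):
    # short when the spread is one rolling standard deviation rich, long when
    # it is one standard deviation cheap, flat otherwise.
    window = 60
    pos = np.zeros(n)
    for t in range(window, n):
        m = x[t - window:t].mean()
        s = x[t - window:t].std()
        z = (x[t] - m) / s
        pos[t] = -1 if z > 1 else (1 if z < -1 else 0)

    pnl = np.sum(pos[:-1] * np.diff(x))  # naive backtest P&L on the spread
    print(f"toy P&L: {pnl:.3f}")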

Relevance: 30.00%

Abstract:

The development of forensic intelligence relies on the expression of suitable models that better represent the contribution of forensic intelligence in relation to the criminal justice system, policing and security. Such models assist in comparing and evaluating methods and new technologies, provide transparency and foster the development of new applications. Interestingly, strong similarities between two separate projects focusing on specific forensic science areas were recently observed. These observations have led to the induction of a general model (Part I) that could guide the use of any forensic science case data in an intelligence perspective. The present article builds upon this general approach by focusing on decisional and organisational issues. The article investigates the comparison process and evaluation system that lie at the heart of the forensic intelligence framework, advocating scientific decision criteria and a structured but flexible and dynamic architecture. These building blocks are crucial and clearly lie within the expertise of forensic scientists. However, this is only part of the problem. Forensic intelligence includes other blocks with their respective interactions, decision points and tensions (e.g. regarding how to guide detection and how to integrate forensic information with other information). Formalising these blocks identifies many questions and potential answers. Addressing these questions is essential for the progress of the discipline. Such a process requires clarifying the role and place of the forensic scientist within the whole process and their relationship to other stakeholders.

Relevance: 30.00%

Abstract:

Mountains and mountain societies provide a wide range of goods and services to humanity, but they are particularly sensitive to the effects of global environmental change. Thus, the definition of appropriate management regimes that maintain the multiple functions of mountain regions in a time of greatly changing climatic, economic, and societal drivers constitutes a significant challenge. Management decisions must be based on a sound understanding of the future dynamics of these systems. The present article reviews the elements required for an integrated effort to project the impacts of global change on mountain regions, and recommends tools that can be used at 3 scientific levels (essential, improved, and optimum). The proposed strategy is evaluated with respect to UNESCO's network of Mountain Biosphere Reserves (MBRs), with the intention of implementing it in other mountain regions as well. First, methods for generating scenarios of key drivers of global change are reviewed, including land use/land cover and climate change. This is followed by a brief review of the models available for projecting the impacts of these scenarios on (1) cryospheric systems, (2) ecosystem structure and diversity, and (3) ecosystem functions such as carbon and water relations. Finally, the cross-cutting role of remote sensing techniques is evaluated with respect to both monitoring and modeling efforts. We conclude that a broad range of techniques is available for both scenario generation and impact assessments, many of which can be implemented without much capacity building across many or even most MBRs. However, to foster implementation of the proposed strategy, further efforts are required to establish partnerships between scientists and resource managers in mountain areas.

Relevance: 30.00%

Abstract:

In this paper we present the theoretical and methodological foundations for the development of a multi-agent Selective Dissemination of Information (SDI) service model that applies Semantic Web technologies for specialized digital libraries. These technologies make it possible to achieve more efficient information management, improving agent-user communication processes and facilitating accurate access to relevant resources. Other tools used are fuzzy linguistic modelling techniques (which ease the interaction between users and the system) and natural language processing (NLP) techniques for semiautomatic thesaurus generation. Also, RSS feeds are used as "current awareness bulletins" to generate personalized bibliographic alerts.
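
A minimal sketch of the "current awareness bulletin" idea: filtering RSS items against a user interest profile. The feed content, profile terms and keyword-overlap scoring are hypothetical stand-ins; the paper's actual model uses multi-agent and fuzzy linguistic techniques not reproduced here:

    import xml.etree.ElementTree as ET

    # Inline stand-in for a fetched RSS "current awareness bulletin".
    rss = """<rss><channel>
      <item><title>Semantic Web services for digital libraries</title></item>
      <item><title>Cataloguing workflows in practice</title></item>
      <item><title>Semiautomatic thesaurus generation with NLP</title></item>
    </channel></rss>"""

    profile = {"semantic web", "digital libraries", "thesaurus"}  # user interests

    # Score each item by how many profile terms its title contains;
    # alert on any item with a nonzero score.
    for item in ET.fromstring(rss).iter("item"):
        title = (item.findtext("title") or "").lower()
        score = sum(term in title for term in profile)
        if score:
            print(score, item.findtext("title"))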

Relevance: 30.00%

Abstract:

Building a personalized model to describe the drug concentration inside the human body for each patient is highly important to clinical practice and demanding for modeling tools. Instead of using traditional explicit methods, in this paper we propose a machine learning approach to describe the relation between drug concentration and patients' features. Machine learning has been widely applied to analyze data in various domains, but it is still new to personalized medicine, especially dose individualization. We focus mainly on the prediction of drug concentrations as well as the analysis of the influence of different features. Models are built based on Support Vector Machines, and the prediction results are compared with those of traditional analytical models.
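
A minimal sketch of this kind of approach using support vector regression; the feature names and synthetic data below are illustrative assumptions, not the paper's patient data or model configuration:

    import numpy as np
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import StandardScaler
    from sklearn.svm import SVR

    rng = np.random.default_rng(0)
    # Hypothetical patient features: dose (mg), body weight (kg), age (years).
    X = np.column_stack([
        rng.uniform(50, 500, 200),
        rng.uniform(40, 110, 200),
        rng.uniform(18, 90, 200),
    ])
    # Synthetic concentration with noise, standing in for measured drug levels.
    y = 0.8 * X[:, 0] / X[:, 1] + 0.02 * X[:, 2] + rng.normal(0, 0.5, 200)

    # Scale features, then fit an RBF-kernel support vector regressor.
    model = make_pipeline(StandardScaler(), SVR(kernel="rbf", C=10.0))
    model.fit(X[:150], y[:150])
    print(model.score(X[150:], y[150:]))  # R^2 on held-out "patients"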

Relevance: 30.00%

Abstract:

Information on the Promising Transition Practices shared at the September 28, 2007, Capacity Building Forum sponsored by Improving Transition Outcomes with Iowa Vocational Rehabilitation Services.

Relevance: 30.00%

Abstract:

Digital libraries (DLs) are seen as the hope for developing countries in their struggle to access scientific and academic publications. However, building such libraries in developing countries is a real challenge. These countries usually face several difficulties, such as low computer and Internet penetration rates, poor ICT infrastructure, and a lack of qualified human resources and of financial resources. Thus, it is imperative to find alternative mechanisms for building DLs that best fit the specificities of these countries. This paper presents the process used for building a digital library at the University Jean Piaget of Cape Verde, created in a context of scarce access to printed materials and serious difficulties in accessing ICT resources. The paper also presents the challenges, the solutions and the adopted methodological framework.

Relevance: 30.00%

Abstract:

This document is intended to lay the foundation for resource reduction strategies in new construction, renovation and demolition. If you have an innovative idea or information that you believe should be included in future updates of this manual, please email Shelly Codner at scodner@region12cog.org or Jan Loyson at Jan.Loyson@Iowalifechanging.com. Throughout this manual, we use the term "waste reduction" to refer to waste management initiatives that will result in less waste going to the landfill. In accordance with the waste management hierarchy, these practices include reducing (waste prevention), reusing (deconstruction and salvage), recycling and renewing (making old things new again), in that order. This manual will explain what these practices are and how to incorporate them into your projects.

Relevance: 30.00%

Abstract:

In the Arabidopsis root meristem, polar auxin transport creates a transcriptional auxin response gradient that peaks at the stem cell niche and gradually decreases as stem cell daughters divide and differentiate [1-3]. The amplitude and extent of this gradient are essential for both stem cell maintenance and root meristem growth [4, 5]. To investigate why expression of some auxin-responsive genes, such as the essential root meristem growth regulator BREVIS RADIX (BRX) [6], deviates from this gradient, we combined experimental and computational approaches. We created cellular-level root meristem models that accurately reproduce distribution of nuclear auxin activity and allow dynamic modeling of regulatory processes to guide experimentation. Expression profiles deviating from the auxin gradient could only be modeled after intersection of auxin activity with the observed differential endocytosis pattern and positive autoregulatory feedback through plasma-membrane-to-nucleus transfer of BRX. Because BRX is required for expression of certain auxin response factor targets, our data suggest a cell-type-specific endocytosis-dependent input into transcriptional auxin perception. This input sustains expression of a subset of auxin-responsive genes across the root meristem's division and transition zones and is essential for meristem growth. Thus, the endocytosis pattern provides specific positional information to modulate auxin response.
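
A minimal sketch of the kind of cellular-level gradient model described above: a one-dimensional file of cells with polar auxin transport and turnover, whose steady state is a graded profile peaking at the tip. The parameters and the 1-D simplification are illustrative assumptions, not the paper's actual model:

    import numpy as np

    # One-dimensional file of cells: auxin is supplied at the tip (stem cell
    # niche), transported polarly shootward, and turned over in each cell.
    n_cells = 30
    influx = 1.0      # auxin supplied at the tip per step (arbitrary units)
    transport = 0.4   # fraction handed to the next cell per step (polar transport)
    decay = 0.1       # per-step turnover in each cell

    auxin = np.zeros(n_cells)
    for _ in range(2000):  # iterate to steady state
        flux = transport * auxin
        auxin = auxin - flux - decay * auxin
        auxin[1:] += flux[:-1]   # received from the cell below
        auxin[0] += influx       # supply at the niche

    # Steady state: auxin decays geometrically with distance from the tip,
    # i.e. a gradient peaking at the niche, as in the response gradient above.
    print(np.round(auxin[:10], 2))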

Relevance: 30.00%

Abstract:

Summary: The specific CD8+ T cell immune response against tumors relies on the recognition by the T cell receptor (TCR) on cytotoxic T lymphocytes (CTL) of antigenic peptides bound to the class I major histocompatibility complex (MHC) molecule. Such tumor-associated antigenic peptides are the focus of tumor immunotherapy with peptide vaccines. The strategy for obtaining an improved immune response often involves the design of modified tumor-associated antigenic peptides. Such modifications aim at creating higher-affinity and/or degradation-resistant peptides and require precise structures of the peptide-MHC class I complex. In addition, the modified peptide must be cross-recognized by CTLs specific for the parental peptide, i.e. it must preserve the structure of the epitope. Detailed structural information on the modified peptide in complex with MHC is necessary for such predictions. In this thesis, the main focus is the development of theoretical in silico methods for the prediction of both structure and cross-reactivity of peptide-MHC class I complexes. Applications of these methods in the context of immunotherapy are also presented. First, a theoretical method for structure prediction of peptide-MHC class I complexes is developed and validated. The approach is based on a molecular dynamics protocol to sample the conformational space of the peptide in its MHC environment. The sampled conformers are evaluated using conformational free energy calculations. The method, which is evaluated for its ability to reproduce 41 X-ray crystallographic structures of different peptide-MHC class I complexes, shows an overall prediction success of 83%. Importantly, in the clinically highly relevant subset of peptide-HLA-A*0201 complexes, the prediction success is 100%. Based on these structure predictions, a theoretical approach for the prediction of cross-reactivity is developed and validated. This method involves the generation of quantitative structure-activity relationships using three-dimensional molecular descriptors and a genetic neural network. The generated relationships are highly predictive, as shown by high cross-validated correlation coefficients (0.78-0.79). Together, the theoretical methods developed here open the door for efficient rational design of improved peptides to be used in immunotherapy.
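
A minimal sketch of the leave-one-out cross-validated correlation coefficient (q^2) used to judge QSAR models such as the one above; the descriptors and activities are synthetic placeholders, and plain linear regression stands in for the thesis's genetic neural network:

    import numpy as np
    from sklearn.linear_model import LinearRegression
    from sklearn.model_selection import LeaveOneOut

    rng = np.random.default_rng(0)
    X = rng.normal(size=(40, 5))                          # fake 3D molecular descriptors
    y = X @ rng.normal(size=5) + rng.normal(0, 0.3, 40)   # fake activities

    # Leave-one-out predictions: refit on all compounds but one, predict the
    # held-out compound, repeat for every compound.
    preds = np.empty_like(y)
    for train, test in LeaveOneOut().split(X):
        model = LinearRegression().fit(X[train], y[train])
        preds[test] = model.predict(X[test])

    # Cross-validated q^2 = 1 - PRESS / total sum of squares.
    q2 = 1 - np.sum((y - preds) ** 2) / np.sum((y - y.mean()) ** 2)
    print(f"LOO q^2 = {q2:.2f}")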