899 results for "free software environment for statistical computing and graphics R"


Relevance: 100.00%

Abstract:

Genetic anticipation is defined as a decrease in age of onset or an increase in severity as a disorder is transmitted through subsequent generations. Anticipation has been noted in the literature for over a century. Recently, anticipation in several diseases, including Huntington's Disease, Myotonic Dystrophy, and Fragile X Syndrome, was shown to be caused by expansion of triplet repeats. Anticipation effects have also been observed in numerous mental disorders (e.g., Schizophrenia, Bipolar Disorder), cancers (Li-Fraumeni Syndrome, Leukemia), and other complex diseases. Several statistical methods have been applied to determine whether anticipation is a true phenomenon in a particular disorder, including standard statistical tests and newly developed affected parent/affected child pair methods. These methods have been shown to be inappropriate for assessing anticipation for a variety of reasons, including familial correlation and low power. We have therefore developed family-based likelihood modeling approaches that model the underlying transmission of the disease gene and the penetrance function, and hence detect anticipation. These methods can be applied in extended families, improving the power to detect anticipation compared with existing methods based only on parents and children. The first method we propose is based on the regressive logistic hazard model and models anticipation through a generational covariate. The second method allows alleles to mutate as they are transmitted from parents to offspring and is appropriate for modeling the known triplet repeat diseases, in which disease alleles can become more deleterious as they are transmitted across generations. To evaluate the new methods, we performed extensive simulation studies, with data simulated under different conditions, to evaluate the effectiveness of the algorithms in detecting genetic anticipation. Analysis by the first method yielded empirical power greater than 87%, based on the 5% type I error critical value identified in each simulation, depending on the method of data generation and the current-age criteria. Analysis by the second method was not possible due to the current formulation of the software. Application of the first method to Huntington's Disease and Li-Fraumeni Syndrome data sets revealed evidence of a generation effect in both cases.
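
To make the first method concrete, here is a minimal sketch of a discrete-time logistic hazard model with a generational covariate, tested by a likelihood-ratio test. The person-period layout, simulated coefficients, and use of statsmodels and scipy are illustrative assumptions, and the sketch ignores the familial correlation that the authors' family-based likelihood approach is designed to handle.

```python
# A minimal sketch, not the authors' software: a discrete-time logistic
# hazard model in which a generational covariate captures anticipation,
# tested against an age-only null model with a likelihood-ratio test.
import numpy as np
import pandas as pd
import statsmodels.api as sm
from scipy import stats

rng = np.random.default_rng(0)
n = 5000  # hypothetical person-period records (one row per year at risk)
df = pd.DataFrame({
    "age": rng.integers(20, 70, n),
    "generation": rng.integers(1, 4, n),  # generation within the pedigree
})
true_logit = -8.0 + 0.08 * df["age"] + 0.5 * df["generation"]  # simulated truth
df["onset"] = rng.binomial(1, 1 / (1 + np.exp(-true_logit)))

# Null model (age only) vs. anticipation model (age + generation).
m0 = sm.Logit(df["onset"], sm.add_constant(df[["age"]])).fit(disp=0)
m1 = sm.Logit(df["onset"], sm.add_constant(df[["age", "generation"]])).fit(disp=0)

# A significant generation coefficient signals anticipation.
lr = 2 * (m1.llf - m0.llf)
print(f"LR statistic = {lr:.1f}, p = {stats.chi2.sf(lr, df=1):.2g}")
```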

Relevance: 100.00%

Abstract:

The built environment is the part of the physical environment made by people, for people. Because the built environment is such a ubiquitous component of the environment, it acts as an important pathway in determining health outcomes. Zoning, a type of urban planning policy, is one of the most important mechanisms connecting the built environment to public health. This policy analysis research paper explores how zoning regulations in Austin, Texas promote or prohibit the development of a healthy built environment. A systematic literature review was obtained from Active Living Research, containing published literature on the relationships between the built environment, physical activity, and health. These studies identified the following four components of the built environment associated with health: access to recreational facilities, sprawl and residential density, land use mix, and sidewalks and their walkability. A hierarchy analysis was then performed to demonstrate the association between these aspects of the built environment and health outcomes such as obesity, cardiovascular disease, and general health. Once these associations had been established, the components of the built environment were adapted into the evaluation criteria used to conduct a public health analysis of Austin's zoning ordinance. A total of eighty-eight regulations were identified as related to these components and their varying associations with human health. Eight regulations were projected to have a negative association with health, three to have both positive and negative associations simultaneously, and nine were indeterminable with the information obtained through the literature review. The remaining sixty-eight regulations were projected to be beneficially associated with human health. It was therefore concluded that Austin's zoning ordinance would have an overwhelmingly positive impact on the public's health, based on the identified associations between the built environment and health outcomes.

Relevance: 100.00%

Abstract:

Background. Physical activity (PA) is a central part of the fight to reduce obesity rates, which are higher among Mexican Americans in the United States than in any other ethnic group. More than half of all Americans do not meet the daily PA recommendations, and 48% of Mexican Americans do not exercise. The built environment is believed to affect participation in physical activity, but its influence on physical activity levels in low-income Mexican Americans living along the Texas-Mexico border has not been investigated. Purpose. The purpose of this secondary data analysis was threefold: (1) to determine the levels of self-reported PA in adults living in Brownsville, Texas; (2) to characterize this population's perceptions of the built environment; and (3) to determine the association between self-reported PA and the built environment in Mexican Americans living in Brownsville, Texas. Methods. 400 participants from the Tu Salud ¡Sí Cuenta! (TSSC) community-wide campaign were included in this secondary data analysis. Percentages for level of physical activity and for the built environment were calculated using SPSS. Perceptions of the built environment were assessed by 14 items. Logistic regression analysis was used to assess the relationship between physical activity and the built environment; all models were adjusted for age, gender, and level of education. Results. Overall, 56.7% of participants (41.97% of men and 59% of women) did not meet the 2008 PA Guidelines for Americans. We analyzed 14 built environment variables to characterize participants' perceptions of the built environment. Odds ratios (OR) were computed to assess whether meeting PA recommendations was associated with the built environment features ranked highest by mean score: neighborhood shops (OR 1.806, CI 1.074-3.038), bus stops (OR 1.436, CI 0.806-2.558), unattended stray dogs (OR 1.806, CI 1.074-3.038), sidewalk access (OR 0.858, CI 0.437-1.686), access to free parks (OR 0.549, CI 0.335-0.900), heavy neighborhood traffic (OR 0.802, CI 0.501-1.285), and crime rate (OR 0.779, CI 0.494-1.228). Associations between physical activity and perceived built environment factors among Mexican Americans participating in the TSSC study were weak. Conclusions. This study provides evidence that PA levels are low in this Mexican American population. The built environment factors assessed here highlight the need for further study of the variables seen as important to the Mexican American population. Lastly, the overall association between PA levels and the built environment was weak, and further studies of the built environment are recommended.
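
As an illustration of the kind of adjusted analysis described, the sketch below fits a logistic regression of meeting PA guidelines on one perceived built-environment item, adjusted for age, gender, and education, and reports the result as an odds ratio with a confidence interval. The variable names and simulated data are assumptions; the original analysis was run in SPSS, not Python.

```python
# A minimal sketch of the adjusted logistic regression described above,
# on invented data. Column names are hypothetical.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
n = 400  # the TSSC analysis included 400 participants
df = pd.DataFrame({
    "met_pa": rng.binomial(1, 0.43, n),       # met the 2008 PA guidelines
    "shops_nearby": rng.binomial(1, 0.5, n),  # perceived item (yes/no)
    "age": rng.integers(18, 80, n),
    "female": rng.binomial(1, 0.6, n),
    "educ_years": rng.integers(0, 17, n),
})

model = smf.logit("met_pa ~ shops_nearby + age + female + educ_years",
                  data=df).fit(disp=0)

# Exponentiate coefficients and confidence bounds to the OR scale.
out = np.exp(model.conf_int()).rename(columns={0: "low", 1: "high"})
out["OR"] = np.exp(model.params)
print(out.loc[["shops_nearby"]])
```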

Relevance: 100.00%

Abstract:

An investigation was undertaken to chemically characterize inhalable particulate matter in the Houston area, with special emphasis on source identification and apportionment of outdoor and indoor atmospheric aerosols using multivariate statistical analyses. Fine (<2.5 µm) particle aerosol samples were collected by means of dichotomous samplers at two fixed-site ambient monitoring stations (Clear Lake and Sunnyside) and one mobile monitoring van in the Houston area during June-October 1981 as part of the Houston Asthma Study. The mobile van allowed particulate sampling to take place both inside and outside of twelve homes. Samples were collected on a 12-h schedule (7 AM-7 PM and 7 PM-7 AM CDT) and analyzed for mass, trace elements, and two anions. Mass was determined gravimetrically; an energy-dispersive X-ray fluorescence (XRF) spectrometer was used to determine elemental composition, and ion chromatography (IC) was used to determine sulfate and nitrate. Average chemical compositions of fine aerosol at each site are presented. Sulfate was found to be the largest single component of the fine fraction mass, comprising approximately 30% of the fine mass outdoors and 12% indoors. Principal components analysis (PCA) was applied to identify sources of aerosols and to assess the role of meteorological factors in the variation of the particulate samples. The results suggested that meteorological parameters were not associated with the sources of the aerosol samples collected at these Houston sites. Source factor contributions to fine mass were calculated using a combination of PCA and stepwise multivariate regression analysis. Much of the total fine mass was apparently contributed by sulfate-related aerosols: on average, sulfate-related aerosols contributed 56% of Houston outdoor ambient fine particulate matter and 26% of indoor fine particulate matter. The characterization of indoor aerosol in residential environments was compared with the results for outdoor aerosols. Much of the indoor aerosol may be due to outdoor sources, but there may also be important contributions from common indoor sources in the home environment, such as smoking and gas cooking.
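
A minimal sketch of the PCA-plus-regression apportionment idea, on invented element concentrations: components from a PCA of standardized concentrations stand in for candidate sources, and a regression of fine mass on the component scores indicates how much mass each factor carries. The element list, the plain (non-stepwise) regression, and skipping the absolute-score rescaling a full apportionment would use are all simplifying assumptions.

```python
# A rough sketch: PCA on standardized element concentrations, then a
# regression of fine mass on the component scores. Data are invented;
# the study used stepwise regression rather than this plain fit.
import numpy as np
import pandas as pd
from sklearn.decomposition import PCA
from sklearn.linear_model import LinearRegression
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(2)
elements = ["S", "Pb", "Fe", "Ca", "Zn", "K", "NO3"]
X = pd.DataFrame(rng.lognormal(size=(120, len(elements))), columns=elements)
fine_mass = 3.0 * X["S"] + 1.5 * X["Fe"] + rng.normal(0, 0.5, 120)

# Components of the standardized data act as candidate source factors.
scores = PCA(n_components=3).fit_transform(StandardScaler().fit_transform(X))

# Coefficients indicate how much fine mass each factor contributes.
reg = LinearRegression().fit(scores, fine_mass)
print("variance in fine mass explained:", round(reg.score(scores, fine_mass), 3))
print("mass loading per factor:", reg.coef_)
```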

Relevance: 100.00%

Abstract:

Accurate quantitative estimation of exposure using retrospective data has been one of the most challenging tasks in the exposure assessment field. To improve these estimates, models have been developed from published exposure databases and their corresponding exposure determinants. These models are designed to be applied to reported exposure determinants obtained from study subjects, or to exposure levels assigned by an industrial hygienist, so that quantitative exposure estimates can be obtained. In an effort to improve the prediction accuracy and generalizability of these models, and considering that the limitations encountered in previous studies might stem from the limitations of traditional statistical methods and concepts, this study proposed and explored computer science-derived data analysis methods, predominantly machine learning approaches. The goal was to develop a set of models using decision tree/ensemble and neural network methods to predict occupational exposure outcomes from literature-derived databases, and to compare, using cross-validation and data-splitting techniques, the resulting prediction capacity with that of traditional regression models. Two cases were addressed: the categorical case, where the exposure level was measured as an exposure rating following the American Industrial Hygiene Association guidelines, and the continuous case, where the exposure result is expressed as a concentration value. Previously developed literature-based exposure databases for 1,1,1-trichloroethane, methylene dichloride, and trichloroethylene were used. When compared with regression estimates, the results showed better accuracy for decision tree/ensemble techniques in the categorical case, while neural networks were better for estimating continuous exposure values. Overrepresentation of classes and overfitting were the main causes of poor neural network performance and accuracy. Estimations based on literature-based databases using machine learning techniques might provide an advantage when applied to methodologies that combine 'expert inputs' with current exposure measurements, like the Bayesian Decision Analysis tool. The use of machine learning techniques to more accurately estimate exposures from literature-based exposure databases might represent a starting point toward independence from expert judgment.
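
The model comparison can be sketched as follows, with simulated exposure determinants standing in for the literature-derived databases: cross-validated R² for a tree ensemble, a small neural network, and linear regression. The model choices and settings are illustrative assumptions, not the study's configurations.

```python
# A minimal sketch of cross-validated model comparison for the
# continuous case, on invented determinant/concentration data.
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import KFold, cross_val_score
from sklearn.neural_network import MLPRegressor
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(3)
X = rng.normal(size=(300, 6))  # exposure determinants (task, ventilation...)
y = X @ [1.0, 0.5, 0, 0, 2.0, 0] + rng.normal(0, 0.5, 300)  # log-concentration

cv = KFold(n_splits=5, shuffle=True, random_state=0)
models = {
    "linear regression": LinearRegression(),
    "tree ensemble": RandomForestRegressor(n_estimators=200, random_state=0),
    "neural network": make_pipeline(
        StandardScaler(),
        MLPRegressor(hidden_layer_sizes=(32,), max_iter=2000, random_state=0)),
}
for name, m in models.items():
    r2 = cross_val_score(m, X, y, cv=cv, scoring="r2")
    print(f"{name}: mean CV R^2 = {r2.mean():.3f}")
```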

Relevance: 100.00%

Abstract:

The freezing and desiccation tolerance of 12 Klebsormidium strains, isolated from various habitats (aero-terrestrial, terrestrial, and hydro-terrestrial) in distinct geographical regions (Antarctic: South Shetlands, King George Island; Arctic: Ellesmere Island, Svalbard; Central Europe: Slovakia), was studied. Each strain was exposed to several freezing (-4°C, -40°C, -196°C) and desiccation (+4°C and +20°C) regimes, simulating both natural and semi-natural freeze-thaw and desiccation cycles. The level of resistance (or survival capacity) was evaluated by chlorophyll a content, viability, and chlorophyll fluorescence measurements. No statistical differences (Kruskal-Wallis tests) between strains originating from different regions were observed. All strains tested were highly resistant to both freezing and desiccation injury. Freezing down to -196°C was the most harmful regime for all studied strains, whereas freezing at -4°C did not influence their survival, and freezing down to -40°C (at a rate of 4°C/min) was not fatal for most strains. RDA analysis showed that certain Antarctic and Arctic strains did not survive desiccation at +4°C, even though freezing at -40°C and desiccation at +20°C were not fatal to them. On the other hand, other strains from the Antarctic, the Arctic, and Central Europe (Slovakia) survived both desiccation at +4°C and freezing down to -40°C. It appears that species of Klebsormidium that occupy environments with both seasonal and diurnal variations in water availability are well adapted to freezing and desiccation injury. Freezing and desiccation tolerance is not species-specific, nor is this resilience found only in polar strains; it is also a feature of temperate strains.
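
The regional comparison reduces to a Kruskal-Wallis test across groups of strains; a minimal sketch with invented survival values (hypothetical Fv/Fm recovery after a freeze-thaw cycle) is shown below. A large p-value corresponds to the "no statistical differences between regions" finding.

```python
# A minimal sketch of the Kruskal-Wallis comparison described above,
# on invented per-strain recovery values grouped by region.
import numpy as np
from scipy.stats import kruskal

rng = np.random.default_rng(4)
# Hypothetical Fv/Fm recovery values for strains from three regions.
antarctic = rng.normal(0.62, 0.05, 4)
arctic = rng.normal(0.60, 0.05, 4)
temperate = rng.normal(0.61, 0.05, 4)

stat, p = kruskal(antarctic, arctic, temperate)
print(f"H = {stat:.2f}, p = {p:.3f}")  # large p -> no regional difference
```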

Relevance: 100.00%

Abstract:

Ice cores from outside the Greenland and Antarctic ice sheets are difficult to date because of seasonal melting and the multiple sources (terrestrial, marine, biogenic, and anthropogenic) of sulfate deposited onto the ice. Here we present a method of volcanic sulfate extraction that relies on fitting sulfate profiles to other ion species measured along the cores, in moving windows in log space. We verify the method with a well dated section of the Belukha ice core from central Eurasia. There are excellent matches to known volcanoes in the preindustrial period, and clear extraction of volcanic peaks in the post-1940 period, where a simple method based on calcium as a proxy for terrestrial sulfate fails due to anthropogenic sulfate deposition. We then attempt to use the same statistical scheme to locate volcanic sulfate horizons within three ice cores from Svalbard and a core from Mount Everest. Volcanic sulfate is <5% of the sulfate budget in every core, and differences in the eruption signals extracted reflect the large differences in environment between the western, northern, and central regions of Svalbard. The Lomonosovfonna and Vestfonna cores span about the last 1000 years, with good extraction of volcanic signals, while Holtedahlfonna, which extends back to about AD 1700, appears to lack a clear record. The Mount Everest core allows clean volcanic signal extraction and extends back to about AD 700, slightly older than a previous flow model suggested. The method may thus be used to extract historical volcanic records from a more diverse geographical range than hitherto.
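
The extraction idea can be sketched as a moving-window regression in log space: fit log-sulfate to the logs of other ion species within each window, and flag large positive residuals as candidate volcanic horizons. The window size, the three proxy ions, the 3-sigma threshold, and the synthetic data are all illustrative assumptions, not the paper's calibrated scheme.

```python
# A minimal sketch: moving-window least-squares fit of log-sulfate to
# log-ion proxies, with residual spikes flagged as candidate eruptions.
import numpy as np

rng = np.random.default_rng(5)
n = 500
ions = np.exp(rng.normal(0, 0.3, (n, 3)))      # e.g. Ca, Na, NH4 proxies
sulfate = ions @ [0.5, 0.3, 0.2] * np.exp(rng.normal(0, 0.1, n))
sulfate[[120, 310]] *= 4.0                     # injected "eruption" spikes

log_s, log_i = np.log(sulfate), np.log(ions)
half = 50                                      # window half-width (samples)
resid = np.full(n, np.nan)
for k in range(half, n - half):
    w = slice(k - half, k + half)
    A = np.column_stack([log_i[w], np.ones(2 * half)])
    coef, *_ = np.linalg.lstsq(A, log_s[w], rcond=None)
    resid[k] = log_s[k] - A[half] @ coef       # excess over the local fit

thresh = np.nanmean(resid) + 3 * np.nanstd(resid)
print("candidate volcanic horizons:", np.where(resid > thresh)[0])
```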

Relevance: 100.00%

Abstract:

The Global Ocean Sampling (GOS) expedition has produced what is currently the largest and geographically most comprehensive metagenomic dataset, including samples from the Atlantic, Pacific, and Indian Oceans. This study makes use of the wide range of environmental conditions and habitats encompassed by the GOS sites in order to investigate the ecological structuring of bacterial and archaeal taxon ranks. Community structures based on taxonomically classified 16S ribosomal RNA (rRNA) gene fragments at the phylum, class, order, family, and genus rank levels were examined using multivariate statistical analysis, and the results were inspected in the context of oceanographic environmental variables and structured habitat classifications. At all taxon rank levels, the community structures of neritic, oceanic, and estuarine biomes, as well as of other, more exotic biomes (salt marsh, lake, mangrove), were readily distinguishable from each other. A strong structuring of the communities with chlorophyll a concentration, and a weaker yet significant structuring with temperature and salinity, were observed. Furthermore, there were significant correlations between community structures and habitat classification. These results were used for further investigation of one-to-one relationships between taxa and environment, and provided indications of ecological preferences shaped by primary production for both cultured and uncultured bacterial and archaeal clades.
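
One simple way to relate community structure to an environmental driver, in the spirit (though not the exact methods) of the multivariate analysis described, is to ordinate relative-abundance profiles and correlate the leading axis with chlorophyll a, as sketched below on invented data.

```python
# A minimal sketch: ordinate genus-level relative abundances with PCA,
# then correlate the first axis with chlorophyll a. The data, the use
# of PCA, and Spearman correlation are illustrative assumptions.
import numpy as np
from scipy.stats import spearmanr
from sklearn.decomposition import PCA

rng = np.random.default_rng(6)
n_sites, n_taxa = 40, 25
chl_a = rng.lognormal(0, 1, n_sites)             # chlorophyll a (mg/m^3)
abund = rng.poisson(5, (n_sites, n_taxa)).astype(float)
abund[:, 0] += 10 * chl_a                        # a taxon tracking chl a
rel = abund / abund.sum(axis=1, keepdims=True)   # relative abundances

axis1 = PCA(n_components=1).fit_transform(rel)[:, 0]
rho, p = spearmanr(axis1, chl_a)
print(f"axis 1 vs chlorophyll a: rho = {rho:.2f}, p = {p:.3g}")
```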

Relevance: 100.00%

Abstract:

The increasing complexity of current software systems is encouraging the development of self-managed software architectures, i.e. systems capable of reconfiguring their structure at runtime to fulfil a set of goals. Several approaches have covered different aspects of their development, but some issues remain open, such as the maintainability and scalability of self-management subsystems. Centralized approaches, like self-adaptive architectures, offer good maintenance properties but do not scale well for large systems. Conversely, decentralized approaches, like self-organising architectures, offer good scalability but are not maintainable: reconfiguration specifications are spread across the system and often tangled with functional specifications. To address these issues, this paper presents an aspect-oriented autonomic reconfiguration approach in which: (1) each subsystem is provided with self-management properties, so that it can evolve itself and its constituent components; and (2) self-management concerns are isolated and encapsulated into aspects, improving their reuse and maintenance. [Summary, translated from Slovenian: An approach based on self-reconfiguration of software architectures is presented.]
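
To make the separation-of-concerns idea concrete, here is a loose Python analogy in which a decorator plays the role of an aspect: the reconfiguration policy (switch to a fallback component after repeated failures) is encapsulated in the wrapper, entirely outside the functional code. This is only an illustration of the idea, not the paper's aspect-oriented framework.

```python
# A loose analogy: a decorator as an "aspect" holding the
# self-management (reconfiguration) concern, kept out of the
# functional code it advises. All names here are hypothetical.
import functools

def self_managed(fallback, max_failures=3):
    def aspect(fn):
        failures = {"count": 0}
        @functools.wraps(fn)
        def wrapper(*args, **kwargs):
            if failures["count"] >= max_failures:
                return fallback(*args, **kwargs)   # permanently reconfigured
            try:
                return fn(*args, **kwargs)
            except Exception:
                failures["count"] += 1
                return fallback(*args, **kwargs)   # transient fallback
        return wrapper
    return aspect

def cached_lookup(key):
    return f"cached value for {key}"

@self_managed(fallback=cached_lookup, max_failures=2)
def live_lookup(key):
    raise ConnectionError("remote component down")  # simulated failure

print(live_lookup("x"))   # falls back; after 2 failures the component
print(live_lookup("y"))   # is treated as reconfigured to the fallback
```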

Relevance: 100.00%

Abstract:

This paper describes a proposal for a language called Link, which has been designed to formalize and operationalize problem-solving strategies. This language is used within a software environment called KSM (Knowledge Structure Manager), which helps developers formulate and operationalize structured knowledge models. The paper presents both its syntax and dynamics, and gives examples of well-known problem-solving strategies formulated in this language.

Relevance: 100.00%

Abstract:

We argue that in order to exploit both independent and-parallelism and or-parallelism in Prolog programs, there is an advantage in recomputing some of the independent goals, as opposed to reusing all of their solutions. We present an abstract model, called the Composition-Tree, for representing and-or parallelism in Prolog programs. The Composition-Tree closely mirrors sequential Prolog execution by recomputing some independent goals rather than fully reusing them. We also outline two environment representation techniques for and-or parallel execution of full Prolog based on the Composition-Tree abstraction. We argue that these techniques have advantages over earlier proposals for exploiting and-or parallelism in Prolog.
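
The recomputation-versus-reuse distinction can be made concrete with a small sketch: for a conjunction of independent goals, the reuse scheme solves each goal once and combines the stored solutions as a cross-product, while the recomputation scheme re-runs later goals for each solution of earlier ones, mirroring sequential Prolog's backtracking order. This illustrates only the distinction itself, not the paper's formal Composition-Tree model.

```python
# A toy sketch of reuse vs. recomputation for independent goals.
# The answers coincide for pure goals; what differs is when and how
# often each goal is executed, which is what the model exploits.
from itertools import product

def solve(goal):
    """Stand-in solver: the solutions of one independent goal."""
    return [f"{goal}{i}" for i in (1, 2)]

def and_solutions(goals, reuse):
    if reuse:
        # Reuse scheme: solve each goal once, cross-product the results.
        return list(product(*(solve(g) for g in goals)))
    # Recomputation scheme: re-run later goals for each solution of
    # the earlier ones, as in sequential Prolog backtracking.
    combos = [()]
    for g in goals:
        combos = [c + (s,) for c in combos for s in solve(g)]
    return combos

print(and_solutions(["q", "r"], reuse=True))
print(and_solutions(["q", "r"], reuse=False))
```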

Relevance: 100.00%

Abstract:

The future Internet is expected to be composed of a mesh of interoperable web services accessible from all over the web. This approach has not yet caught on, since global user-service interaction is still an open issue. This paper states one vision of next-generation front-end Web 2.0 technology that will enable integrated access to services, content, and things in the future Internet. We illustrate how front-ends that wrap traditional services and resources can be tailored to the needs of end users, converting end users into prosumers (creators and consumers of service-based applications). To do this, we propose an architecture that end users without programming skills can use to create front-ends, consult catalogues of resources tailored to their needs, easily integrate and coordinate front-ends, and create composite applications to orchestrate services in their back-end. The paper includes a case study illustrating that current user-centred web development tools are at a very early stage of evolution, and we provide statistical data on how the proposed architecture improves on these tools. This paper is based on research conducted by the Service Front End (SFE) Open Alliance initiative.

Relevance: 100.00%

Abstract:

Access to information and continuing education are critical factors for physicians and researchers all over the world. For African professionals, the situation is even more problematic due to frequently difficult access to technological infrastructure and basic information. Both education and information technologies (e.g., hardware, software, or networking) are expensive and unaffordable for many African professionals. Thus, the use of e-learning and an open approach to information exchange and software use have already been proposed to improve medical informatics in Africa. In this context, the AFRICA BUILD project, supported by the European Commission, aims to develop a virtual platform providing access to a wide range of biomedical informatics and learning resources for professionals and researchers in Africa. A consortium of four African and four European partners works together in this initiative. Within this framework, we have developed a prototype cloud-computing infrastructure to demonstrate, as a proof of concept, the feasibility of this approach. We have conducted the experiment in two different locations in Africa: Burundi and Egypt. As shown in this paper, technologies such as cloud computing and the use of open-source medical software for a wide range of cases present significant challenges and opportunities for developing countries, such as many in Africa.

Relevance: 100.00%

Abstract:

There is growing concern about quality control at present, which has led to a search for methods capable of effectively addressing reliability analysis as part of statistics. Managers, researchers, and engineers must understand that 'statistical thinking' is not just a set of statistical tools. They should start approaching 'statistical thinking' from a 'systems' perspective, which means developing systems that combine specific statistical tools and other methodologies for a given activity. The aim of this article is to encourage them (engineers, researchers, and managers) to develop a new way of thinking.

Relevance: 100.00%

Abstract:

During the last century, much research in the business, marketing, and technology fields has developed the innovation research line, and a large amount of knowledge can be found in the literature. Currently, the importance of systematic and open approaches to managing the available innovation sources is well established in many knowledge fields, including the software engineering sector, where organizations need to absorb and exploit as many innovative ideas as possible to succeed in the current competitive environment. This Master's Thesis presents a study of the innovation sources in the software engineering field. The main research goals of this work are the identification and relevance assessment of the available innovation sources, and an understanding of the trends in innovation source usage. Firstly, a general review of the literature was conducted in order to define the research area and to identify research gaps. Secondly, a Systematic Literature Review (SLR) was chosen as the research method, in order to report reliable conclusions by systematically collecting quality evidence about innovation sources in the software engineering field. This contribution provides resources, built on the empirical studies included in the SLR, to support systematic identification and adequate exploitation of the innovation sources most suitable for software engineering. Several artefacts, such as lists, taxonomies, and relevance assessments of the innovation sources most suitable for software engineering, have been built, and their usage trends over the last decades, as well as their particularities in some countries and knowledge fields, especially software engineering, have been researched. This work can help researchers, managers, and practitioners of innovative software organizations to systematize critical activities in innovation processes, such as the identification and exploitation of the most suitable opportunities. Innovation researchers can use the results of this work to conduct studies involving the innovation sources research area, whereas organization managers and software practitioners can use the provided outcomes in a systematic way to improve their innovation capability, consequently increasing value creation in the processes they run to provide products and services useful to their environment. In summary, this Master's Thesis investigates the innovation sources in the software engineering field, providing useful resources to support effective innovation source management. Several aspects should be studied in more depth to increase the accuracy of the presented results and to obtain more resources built on empirical knowledge. This can be supported by the INnovation SOurces MAnagement (InSoMa) framework, which is introduced in this work to encourage open and systematic approaches to identifying and exploiting the innovation sources in the software engineering field.