938 results for PREDICTION METHOD


Relevância:

60.00%

Publicador:

Resumo:

Structural genomics initiatives aim to elucidate representative 3D structures for the majority of protein families over the next decade, but many obstacles must be overcome. The correct design of constructs is extremely important since many proteins will be too large or contain unstructured regions and will not be amenable to crystallization. It is therefore essential to identify regions in protein sequences that are likely to be suitable for structural study. Scooby-Domain is a fast and simple method to identify globular domains in protein sequences. Domains are compact units of protein structure and their correct delineation will aid structural elucidation through a divide-and-conquer approach. Scooby-Domain predictions are based on the observed lengths and hydrophobicities of domains from proteins with known tertiary structure. The prediction method employs an A*-search to identify sequence regions that form a globular structure and those that are unstructured. On a test set of 173 proteins with consensus CATH and SCOP domain definitions, Scooby-Domain has a sensitivity of 50% and an accuracy of 29%, which is better than current state-of-the-art methods. The method does not rely on homology searches and, therefore, can identify previously unknown domains.
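Scooby-Domain's actual scoring is based on observed length and hydrophobicity distributions from proteins of known structure, searched with A*. Purely as a loose illustration of the idea, the sketch below scores candidate segments with the standard Kyte-Doolittle hydropathy scale plus a crude length prior, and runs a best-first (A*-style, but without an admissible heuristic) search over cut points; all scoring constants are invented for illustration and are not the published method's values.

```python
import heapq

# Kyte-Doolittle hydropathy scale (standard values).
KD = {"A": 1.8, "R": -4.5, "N": -3.5, "D": -3.5, "C": 2.5, "Q": -3.5,
      "E": -3.5, "G": -0.4, "H": -3.2, "I": 4.5, "L": 3.8, "K": -3.9,
      "M": 1.9, "F": 2.8, "P": -1.6, "S": -0.8, "T": -0.7, "V": 4.2,
      "W": -0.9, "Y": -1.3}

def segment_score(seq, start, end):
    """Score [start, end) as a putative domain: mean hydropathy plus a
    crude length prior peaking near 120 residues (invented constants)."""
    length = end - start
    hydro = sum(KD[a] for a in seq[start:end]) / length
    length_prior = -abs(length - 120) / 120.0
    return hydro + length_prior

def delineate(seq, min_len=40, max_len=250):
    """Best-first search over cut points: a state is (negated cumulative
    score, position, segments so far); expanding a state appends one more
    candidate domain. Returns the first complete segmentation popped, which
    is greedy rather than guaranteed optimal (no admissible heuristic)."""
    n = len(seq)
    frontier = [(0.0, 0, [])]  # heapq is a min-heap, hence negated scores
    while frontier:
        neg_score, pos, segs = heapq.heappop(frontier)
        if pos == n:
            return segs
        for end in range(pos + min_len, min(pos + max_len, n) + 1):
            s = segment_score(seq, pos, end)
            heapq.heappush(frontier, (neg_score - s, end, segs + [(pos, end)]))
    return []
```

The returned segments partition the sequence, so domain boundaries can be read directly off the (start, end) pairs.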

Relevância:

60.00%

Publicador:

Resumo:

All over the world, electrical power systems are undergoing radical change, driven by the urgent need to decarbonize electricity supply, to replace aging assets and to make effective use of rapidly evolving information and communication technologies (ICTs). All of these goals converge in one direction: the 'Smart Grid.' The Smart Grid can be described as the transparent, seamless, and instantaneous two-way delivery of energy information, enabling the electricity industry to better manage energy delivery and transmission and empowering consumers to have more control over energy decisions. Basically, the vision of the Smart Grid is to provide much better visibility into lower-voltage networks and to allow consumers to participate in the operation of the power system, mostly through smart meters and Smart Homes. A Smart Grid incorporates advanced ICTs to convey real-time information and facilitate the near-instantaneous balancing of supply and demand on the electrical grid. The operational data collected by the Smart Grid and its sub-systems will allow system operators to quickly identify the best course of action against attacks, vulnerabilities, and similar threats arising from a variety of incidents. However, realizing the Smart Grid first depends upon understanding and researching its key performance components and developing a proper education program to equip the current and future workforce with the knowledge and skills to operate this greatly advanced system. The aim of this chapter is to provide a basic discussion of the Smart Grid concept, the evolution and components of the Smart Grid, and its environmental impacts, and then, in some detail, to describe the technologies required for its realization. Even though the Smart Grid concept is not yet fully defined, the chapter will help describe the key enabling technologies and thus allow the reader to take part in the debate over the future of the Smart Grid. The chapter concludes with an experimental description and the results of developing a hybrid prediction method for solar power that is applicable to the successful implementation of the Smart Grid.
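The chapter's hybrid prediction method for solar power is not detailed in this summary. Purely as a generic illustration of what "hybrid" forecasting means, the sketch below blends a persistence forecast with a moving-average forecast, choosing the blend weight that minimised recent one-step-ahead error; the functions, window sizes and blending scheme are assumptions for illustration, not the chapter's method.

```python
def persistence(series):
    """Persistence forecast: the next value equals the last observation."""
    return series[-1]

def moving_average(series, window=3):
    """Smoothing forecast: mean of the last `window` observations."""
    return sum(series[-window:]) / min(window, len(series))

def hybrid_forecast(series, window=3, history=8):
    """Blend the two base forecasts with a weight chosen by grid search to
    minimise squared error over the last `history` one-step predictions."""
    best_w, best_err = 0.0, float("inf")
    for i in range(11):  # candidate blend weights w in {0.0, 0.1, ..., 1.0}
        w = i / 10.0
        err = 0.0
        for t in range(max(window, len(series) - history), len(series)):
            past = series[:t]
            pred = w * persistence(past) + (1 - w) * moving_average(past, window)
            err += (pred - series[t]) ** 2
        if err < best_err:
            best_w, best_err = w, err
    return best_w * persistence(series) + (1 - best_w) * moving_average(series, window)
```

The same blending pattern extends to stronger base models (e.g. a physical clear-sky model plus a statistical learner), which is the usual motivation for hybrid schemes.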

Relevância:

60.00%

Publicador:

Resumo:

In this paper we consider a mechanical system self-excited by dry friction in order to study the bifurcational behavior of the arising vibrations. The oscillating system consists of a mass-block-on-belt system which is self-excited by static and Coulomb friction. We analyze the system behavior numerically through bifurcation diagrams, phase portraits, frequency spectra and Poincaré maps, which show the existence of non-homoclinic and homoclinic chaos and a route to homoclinic chaos. The homoclinic chaos is also analyzed analytically via the Melnikov prediction method. The system dynamics are characterized by the existence of two potential wells in the phase plane, which exhibit rich bifurcational and chaotic behavior.
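The Melnikov prediction method referenced above can be sketched in its standard textbook form (not the paper's specific friction model): for a perturbed planar system q' = f(q) + eps*g(q, t) whose unperturbed part has a homoclinic orbit q0(t), the Melnikov function measures the leading-order splitting of the stable and unstable manifolds,

```latex
M(t_0) = \int_{-\infty}^{\infty} f\bigl(q_0(t)\bigr) \wedge g\bigl(q_0(t),\, t + t_0\bigr)\, dt ,
\qquad f \wedge g := f_1 g_2 - f_2 g_1 .
```

Simple zeros of M(t_0) indicate transverse homoclinic intersections and hence, by the Smale-Birkhoff theorem, the possibility of homoclinic chaos.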

Relevância:

60.00%

Publicador:

Resumo:

Graduate Program in Dental Sciences - FOAR

Relevância:

60.00%

Publicador:

Resumo:

Fundação de Amparo à Pesquisa do Estado de São Paulo (FAPESP)

Relevância:

60.00%

Publicador:

Resumo:

A number of autonomous underwater vehicles (AUVs) are equipped with commercial ducted propellers, most of them originally produced for the remotely operated vehicle (ROV) industry. However, AUVs and ROVs operate quite differently: the ROV works in almost the bollard-pull condition, while the AUV works at larger cruising speeds. Moreover, ducted propellers can influence the maneuverability of an AUV because the duct generates lift at the point most distant from the vehicle's center of mass. In this work, we propose modeling the hydrodynamic forces and moment on a ducted propeller using a numerical (CFD) simulation and analytical and semi-empirical (ASE) approaches. Predicted values are compared to experimental results produced in a towing tank. The results confirm the advantages of combining CFD and ASE methods for modeling the influence of the propeller duct on AUV maneuverability. (C) 2012 Elsevier Ltd. All rights reserved.

Relevância:

60.00%

Publicador:

Resumo:

The dynamic behaviour of a fishing vessel in waves is studied in order to reveal its parametric rolling characteristics. This paper presents experimental and numerical results in longitudinal regular waves. The experimental results are compared against the results of a time-domain non-linear strip theory model of ship motions in six degrees of freedom. These results contribute to the validation of the parametric rolling prediction method, so that it can be used as an assessment tool to evaluate both the susceptibility to and the severity of parametric rolling at the early design stage of these types of vessels.
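As background to the prediction problem: parametric rolling is commonly idealised by a damped Mathieu-type roll equation in which the restoring term varies with wave encounter (a standard single-degree-of-freedom idealisation, not the paper's full six-degree-of-freedom model),

```latex
\ddot{\phi} + 2\zeta\omega_\phi\,\dot{\phi}
  + \omega_\phi^{2}\bigl(1 + h\cos\omega_e t\bigr)\,\phi = 0 ,
```

where phi is the roll angle, omega_phi the natural roll frequency, zeta the damping ratio, h the relative restoring variation, and omega_e the wave encounter frequency. The principal parametric resonance occurs near omega_e ≈ 2*omega_phi, i.e. an encounter period of about half the natural roll period, which is the condition susceptibility assessments check first.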

Relevância:

60.00%

Publicador:

Resumo:

This study investigates the estimation of forest attributes from LiDAR data in the Fuenfría Valley (Cercedilla, Madrid). Two LiDAR flights (2002 and 2011) were available, and a field inventory of 60 plots was carried out in the winter of 2013. First, six stand attributes (volume, basal area, total aboveground biomass, dominant height, density and quadratic mean diameter) were estimated for 2013 at the pixel, stand and forest levels; multiple linear regression models predicted these attributes with good accuracy. Second, different methods for estimating the diameter distribution were tested: on the one hand, the percentile prediction method and, on the other, the parameter prediction method, the latter applied to a simple Weibull function, a double Weibull function, and a combination of the two according to the distribution that best fitted each plot. Neither method proved sufficiently valid for predicting the diameter distribution. Finally, volume and basal-area growth were estimated by comparing the 2002 and 2011 flights. Although the LiDAR technology differed between flights and only one complete inventory (2013) was available, the fitted models show good goodness of fit, and pixel-level growth was found to be significantly related to the pixel's mean slope, aspect and elevation.
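The parameter prediction method mentioned above requires fitting a Weibull function to each plot's diameter distribution. As a loose illustration (a generic fitting routine, not the study's procedure), the sketch below recovers two-parameter Weibull shape and scale from a sample by the method of moments, using bisection on the coefficient of variation:

```python
import math
import statistics

def weibull_moments_fit(sample):
    """Two-parameter Weibull fit by the method of moments.

    For shape k and scale c: mean = c * Gamma(1 + 1/k) and
    CV^2 = Gamma(1 + 2/k) / Gamma(1 + 1/k)^2 - 1, so the sample CV pins
    down k (solved by bisection; CV is decreasing in k), and the sample
    mean then gives c.
    """
    mean = statistics.fmean(sample)
    cv = statistics.stdev(sample) / mean

    def cv_of_shape(k):
        g1 = math.gamma(1 + 1 / k)
        g2 = math.gamma(1 + 2 / k)
        return math.sqrt(g2 / g1 ** 2 - 1)

    lo, hi = 0.1, 50.0  # bracket for the shape parameter
    for _ in range(80):
        mid = (lo + hi) / 2
        if cv_of_shape(mid) > cv:  # CV too large -> shape must be larger
            lo = mid
        else:
            hi = mid
    k = (lo + hi) / 2
    c = mean / math.gamma(1 + 1 / k)
    return k, c
```

A double-Weibull variant, as tested in the study, would mix two such components with a weight parameter and require a more general optimiser.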

Relevância:

60.00%

Publicador:

Resumo:

"February 22, 1977."

Relevância:

60.00%

Publicador:

Resumo:

Thesis (Ph.D.)--University of Washington, 2016-06

Relevância:

60.00%

Publicador:

Resumo:

Elevated ocean temperatures can cause coral bleaching, the loss of colour from reef-building corals because of a breakdown of the symbiosis with the dinoflagellate Symbiodinium. Recent studies have warned that global climate change could increase the frequency of coral bleaching and threaten the long-term viability of coral reefs. These assertions are based on projecting the coarse output from atmosphere-ocean general circulation models (GCMs) to the local conditions around representative coral reefs. Here, we conduct the first comprehensive global assessment of coral bleaching under climate change by adapting the NOAA Coral Reef Watch bleaching prediction method to the output of a low- and high-climate sensitivity GCM. First, we develop and test algorithms for predicting mass coral bleaching with GCM-resolution sea surface temperatures for thousands of coral reefs, using a global coral reef map and 1985-2002 bleaching prediction data. We then use the algorithms to determine the frequency of coral bleaching and required thermal adaptation by corals and their endosymbionts under two different emissions scenarios. The results indicate that bleaching could become an annual or biannual event for the vast majority of the world's coral reefs in the next 30-50 years without an increase in thermal tolerance of 0.2-1.0 degrees C per decade. The geographic variability in required thermal adaptation found in each model and emissions scenario suggests that coral reefs in some regions, like Micronesia and western Polynesia, may be particularly vulnerable to climate change. Advances in modelling and monitoring will refine the forecast for individual reefs, but this assessment concludes that the global prognosis is unlikely to change without an accelerated effort to stabilize atmospheric greenhouse gas concentrations.
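The NOAA Coral Reef Watch approach adapted above is built on Degree Heating Weeks (DHW): accumulated exceedances of sea surface temperature above the climatological maximum of the monthly means (MMM). A minimal sketch follows the published DHW convention (accumulation over a trailing 12-week window of HotSpots of at least 1 °C, in °C-weeks, with alert thresholds at 4 and 8); the thresholds and window follow NOAA's definitions, while the code itself, which uses weekly rather than NOAA's twice-weekly data, is illustrative:

```python
def degree_heating_weeks(weekly_sst, mmm, window=12):
    """Rolling Degree Heating Weeks: for each week, sum the HotSpots
    (SST - MMM, counted only when >= 1.0 degC) over the trailing
    `window` weeks. Units are degC-weeks."""
    hotspots = [max(0.0, sst - mmm) for sst in weekly_sst]
    dhw = []
    for i in range(len(weekly_sst)):
        recent = hotspots[max(0, i - window + 1): i + 1]
        dhw.append(sum(h for h in recent if h >= 1.0))
    return dhw

def bleaching_alert(dhw_value):
    """NOAA-style alert levels: DHW >= 4 degC-weeks means significant
    bleaching is likely; DHW >= 8 means severe bleaching and mortality."""
    if dhw_value >= 8.0:
        return "severe"
    if dhw_value >= 4.0:
        return "bleaching"
    return "watch"
```

Projecting GCM sea surface temperatures through such a routine, reef by reef, is what turns coarse climate output into the bleaching-frequency maps described above.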

Relevância:

60.00%

Publicador:

Resumo:

A study of the information available on the settlement characteristics of backfill in restored opencast coal mining sites and other similar earthworks projects has been undertaken. In addition, the methods of opencast mining, compaction controls, monitoring and test methods have been reviewed. To consider and develop methods of predicting the settlement of fill, three sites in the West Midlands have been examined; at each, the backfill had been placed in a controlled manner. In addition, use has been made of a finite element computer program to compare a simple two-dimensional linear elastic analysis with field observations of surface settlements in the vicinity of buried highwalls. On controlled backfill sites, settlement predictions have been made accurately, based on a linear relationship between settlement (expressed as a percentage of fill height) and the logarithm of time. This 'creep' settlement was found to be effectively complete within 18 months of restoration. A decrease in this percentage settlement was observed with increasing fill thickness; this is believed to be related to the speed with which the backfill is placed. A rising water table within the backfill is indicated to cause additional gradual settlement. A prediction method, based on settlement monitoring, has been developed and used to determine the pattern of settlement across highwalls and buried highwalls. The zone of appreciable differential settlement was found to be mainly limited to the highwall area, and its magnitude was dictated by the highwall inclination. With a backfill cover of about 15 metres over a buried highwall, the magnitude of differential settlement was negligible. Use has been made of the proposed settlement prediction method and monitoring to control the redevelopment of restored opencast sites. The specifications, tests and monitoring techniques developed in recent years have been used to aid this. Such techniques have been valuable in restoring land previously derelict due to past underground mining.
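The monitoring-based prediction rests on the reported linear relationship between settlement (as a percentage of fill height) and the logarithm of time. A minimal sketch of fitting that line by least squares and extrapolating it (the data values used in the test are illustrative, not measurements from the sites studied):

```python
import math

def fit_log_time_settlement(times, settlements):
    """Least-squares fit of s = a + b*log10(t), where s is settlement as a
    percentage of fill height and t is time since restoration (months)."""
    xs = [math.log10(t) for t in times]
    n = len(xs)
    mean_x = sum(xs) / n
    mean_s = sum(settlements) / n
    b = sum((x - mean_x) * (s - mean_s) for x, s in zip(xs, settlements)) \
        / sum((x - mean_x) ** 2 for x in xs)
    a = mean_s - b * mean_x
    return a, b

def predict_settlement(a, b, t):
    """Extrapolate the fitted creep-settlement line to time t (months)."""
    return a + b * math.log10(t)
```

In practice the fitted slope b would be compared across fill thicknesses, since the study found the percentage settlement decreases as fill thickness increases.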

Relevância:

60.00%

Publicador:

Resumo:

This work considers an application of the heterogeneous-variables system prediction method to the time series analysis problem with respect to sample size. A logical-and-probabilistic correlation is constructed from the class of logical decision functions. Two cases are considered: when the information about the event is contained in the process itself, and when it is contained in a dependent process.

Relevância:

60.00%

Publicador:

Resumo:

X-ray computed tomography (CT) is a non-invasive medical imaging technique that generates cross-sectional images by acquiring attenuation-based projection measurements at multiple angles. Since its first introduction in the 1970s, substantial technical improvements have led to the expanding use of CT in clinical examinations. CT has become an indispensable imaging modality for the diagnosis of a wide array of diseases in both pediatric and adult populations [1, 2]. Currently, approximately 272 million CT examinations are performed annually worldwide, with nearly 85 million of these in the United States alone [3]. Although this trend has decelerated in recent years, CT usage is still expected to increase mainly due to advanced technologies such as multi-energy [4], photon counting [5], and cone-beam CT [6].

Despite the significant clinical benefits, concerns have been raised regarding the population-based radiation dose associated with CT examinations [7]. From 1980 to 2006, the effective dose from medical diagnostic procedures rose six-fold, with CT contributing to almost half of the total dose from medical exposure [8]. For each patient, the risk associated with a single CT examination is likely to be minimal. However, the relatively large population-based radiation level has led to enormous efforts among the community to manage and optimize the CT dose.

As promoted by the international campaigns Image Gently and Image Wisely, exposure to CT radiation should be appropriate and safe [9, 10]. It is thus a shared responsibility to optimize the amount of radiation dose used in CT examinations. The key to dose optimization is to determine the minimum amount of radiation dose that achieves the targeted image quality [11]. Based on this principle, dose optimization would benefit significantly from effective metrics to characterize radiation dose and image quality for a CT exam. Moreover, if accurate predictions of radiation dose and image quality were possible before the initiation of the exam, it would be feasible to personalize the exam by adjusting the scanning parameters to achieve a desired level of image quality. The purpose of this thesis is to design and validate models to prospectively quantify patient-specific radiation dose and task-based image quality. The dual aim of the study is to implement the theoretical models in clinical practice by developing an organ-based dose monitoring system and an image-based noise addition software for protocol optimization.

More specifically, Chapter 3 aims to develop an organ dose-prediction method for CT examinations of the body under constant tube current condition. The study effectively modeled the anatomical diversity and complexity using a large number of patient models with representative age, size, and gender distribution. The dependence of organ dose coefficients on patient size and scanner models was further evaluated. Distinct from prior work, these studies use the largest number of patient models to date with representative age, weight percentile, and body mass index (BMI) range.
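Organ dose coefficients of the kind described above are commonly expressed as CTDIvol-normalised conversion factors that decay roughly exponentially with patient size. A minimal sketch under that assumption; the functional form is a common convention in the CT dosimetry literature, while the constants alpha and beta below are invented placeholders, not the thesis's fitted organ- and scanner-specific values:

```python
import math

def organ_dose(ctdivol_mgy, patient_diameter_cm, alpha=2.8, beta=0.045):
    """Estimate organ dose (mGy) as h(d) * CTDIvol, with an exponential
    size dependence h(d) = alpha * exp(-beta * d), d being an effective
    patient diameter in cm. alpha and beta are illustrative placeholders
    for organ- and scanner-specific regression fits."""
    h = alpha * math.exp(-beta * patient_diameter_cm)
    return h * ctdivol_mgy
```

The exponential form captures the key behaviour evaluated in the chapter: for the same CTDIvol, larger patients receive lower organ doses because of self-attenuation.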

With effective quantification of organ dose under constant tube current condition, Chapter 4 aims to extend the organ dose prediction system to tube current modulated (TCM) CT examinations. The prediction, applied to chest and abdominopelvic exams, was achieved by combining a convolution-based estimation technique that quantifies the radiation field, a TCM scheme that emulates modulation profiles from major CT vendors, and a library of computational phantoms with representative sizes, ages, and genders. The prospective quantification model is validated by comparing the predicted organ dose with the dose estimated based on Monte Carlo simulations with TCM function explicitly modeled.

Chapter 5 aims to implement the organ dose-estimation framework in clinical practice by developing an organ dose-monitoring program based on commercial software (Dose Watch, GE Healthcare, Waukesha, WI). In the first phase of the study we focused on body CT examinations, so the patient's major body landmark information was extracted from the patient scout image in order to match clinical patients against a computational phantom in the library. The organ dose coefficients were estimated based on CT protocol and patient size, as reported in Chapter 3. The exam CTDIvol, DLP, and TCM profiles were extracted and used to quantify the radiation field using the convolution technique proposed in Chapter 4.

With effective methods to predict and monitor organ dose, Chapter 6 aims to develop and validate improved measurement techniques for image quality assessment. It outlines the method that was developed to assess and predict quantum noise in clinical body CT images. Compared with previous phantom-based studies, this study accurately assessed the quantum noise in clinical images and further validated the correspondence between phantom-based measurements and the expected clinical image quality as a function of patient size and scanner attributes.

Chapter 7 aims to develop a practical strategy to generate hybrid CT images and assess the impact of dose reduction on diagnostic confidence for the diagnosis of acute pancreatitis. The general strategy is (1) to simulate synthetic CT images at multiple reduced-dose levels from clinical datasets using an image-based noise addition technique; (2) to develop quantitative and observer-based methods to validate the realism of simulated low-dose images; (3) to perform multi-reader observer studies on the low-dose image series to assess the impact of dose reduction on the diagnostic confidence for multiple diagnostic tasks; and (4) to determine the dose operating point for clinical CT examinations based on the minimum diagnostic performance to achieve protocol optimization.
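The image-based noise addition in step (1) can be sketched from first principles: if quantum noise variance scales inversely with dose, then simulating a scan at dose fraction r from a full-dose image amounts to adding zero-mean noise with standard deviation sigma_full * sqrt(1/r - 1). The Gaussian pixel-domain noise model and parameter names below are simplifying assumptions; clinical implementations of this technique also account for local noise texture and detector physics:

```python
import math
import random

def simulate_reduced_dose(image, sigma_full, dose_fraction, seed=0):
    """Add zero-mean Gaussian noise so the result mimics an acquisition at
    `dose_fraction` (0 < r <= 1) of the original dose. Since quantum noise
    variance ~ 1/dose, the target variance is sigma_full^2 / r, and the
    added noise needs sigma_add = sigma_full * sqrt(1/r - 1)."""
    rng = random.Random(seed)
    sigma_add = sigma_full * math.sqrt(1.0 / dose_fraction - 1.0)
    return [pixel + rng.gauss(0.0, sigma_add) for pixel in image]
```

For example, a quarter-dose simulation (r = 0.25) from an image with 20 HU of quantum noise requires added noise of about 34.6 HU, giving a combined noise of 40 HU.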

Chapter 8 concludes the thesis with a summary of accomplished work and a discussion about future research.

Relevância:

60.00%

Publicador:

Resumo:

Veterinary medicines (VMs) from the agricultural industry can enter the environment in a number of ways. This includes direct exposure through aquaculture, accidental spillage and disposal, and indirect entry by leaching from manure or runoff after treatment. Many compounds used in animal treatments have ecotoxic properties that may have chronic or sometimes lethal effects when they come into contact with non-target organisms. VMs enter the environment in mixtures, potentially having additive effects. Traditional ecotoxicology tests are used to determine the lethal and sometimes reproductive effects on freshwater and terrestrial organisms. However, organisms used in ecotoxicology tests can be unrepresentative of the populations that are likely to be exposed to the compound in the environment. Most often, tests address single-compound toxicity, but mixture effects may be significant and should be included in ecotoxicology testing. This work investigates the use, measured environmental concentrations (MECs) and potential impact of sea lice treatments on salmon farms in Scotland. Alternative methods for ecotoxicology testing, including mixture toxicity, and the use of in silico techniques to predict the chronic impact of VMs on different species of aquatic organisms were also investigated. The Scottish Environment Protection Agency (SEPA) provided information on the use of five sea lice treatments from 2008-2011 on Scottish salmon farms. This information was combined with the recently available data on sediment MECs for the years 2009-2012, provided by SEPA, using ArcGIS 10.1. In-depth analysis of these data showed that, of a total of 55 sites, 30 had a MEC higher than the maximum allowable concentration (MAC) set by SEPA for emamectin benzoate, and 7 sites had a MEC higher than the MAC for teflubenzuron.
A number of sites up to 16 km away from the nearest salmon farm reported as using either emamectin benzoate or teflubenzuron measured positive for the two treatments. There was no relationship between current direction and the distribution of the sea lice treatments, nor was there any evidence for alternative sources of the compounds, e.g. land treatments. The sites with MECs higher than the MAC could pose a risk to non-target organisms and disrupt the species dynamics of the area. There was evidence that some marine protected sites might be at risk of exposure to these compounds. To complement this work, the acute mixture toxicity of the 5 sea lice treatments, plus one major metabolite, 3-phenoxybenzoic acid (3PBA), was measured using an assay based on the bioluminescent bacterium Aliivibrio fischeri. When exposed to the 5 sea lice treatments and 3PBA, A. fischeri showed a response to 3PBA, emamectin benzoate and azamethiphos, as well as combinations of the three. In order to establish any additive effect of the sea lice treatments, the efficacy of two mixture prediction equations, concentration addition (CA) and independent action (IA), was tested using the results from single-compound dose-response curves. In this instance, IA was the more effective prediction method, with a linear regression confidence interval of 82.6% compared with 22.6% for CA. In silico molecular docking was carried out to predict the chronic effects of 15 VMs (including the five used for sea lice control). Molecular docking has been proposed as an alternative screening method for the chronic effects of large animal treatments on non-target organisms. The oestrogen receptor alpha (ERα) of 7 non-target bony fish and of the African clawed frog Xenopus laevis was modelled using SwissModel.
These models were then 'docked' to oestradiol, the synthetic oestrogen ethinylestradiol, two known xenoestrogens, dichlorodiphenyltrichloroethane (DDT) and bisphenol A (BPA), the anti-oestrogen breast cancer treatment tamoxifen and the 15 VMs using AutoDock 4. Based on the results of this work, four VMs were identified as possible xenoestrogens or anti-oestrogens: cypermethrin, deltamethrin, fenbendazole and teflubenzuron. Further investigation of these four VMs using in vitro assays has been suggested as future work. A modified recombinant yeast oestrogen screen (YES) was attempted using the cDNA of the ERα of the zebrafish Danio rerio and the rainbow trout Oncorhynchus mykiss. Due to time constraints and difficulties with the cloning protocols, this work could not be completed. Use of such in vitro assays would allow further investigation of the oestrogenic potential of the highlighted VMs. In conclusion, VMs used as sea lice treatments, such as teflubenzuron and emamectin benzoate, may be more persistent and have a wider range in the environment than previously thought. Mixtures of sea lice treatments have been found to persist together in the environment, and the effects of these mixtures on the bacterium A. fischeri can be predicted using the IA equation. Finally, molecular docking may be a suitable tool to predict chronic endocrine-disrupting effects and identify varying degrees of impact on the ERα of nine species of aquatic organisms.
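The two mixture models compared above have standard forms: concentration addition (CA) predicts the mixture effect concentration from the condition that the toxic units sum to one, sum(c_i / ECx_i) = 1, while independent action (IA) combines individual effects as E_mix = 1 - prod(1 - E_i). A minimal sketch using two-parameter log-logistic (Hill) dose-response curves; the EC50s and slopes in the test are invented for illustration, not values from the A. fischeri assay:

```python
def hill_effect(conc, ec50, slope):
    """Fraction of maximal effect at `conc` for a log-logistic curve."""
    if conc <= 0:
        return 0.0
    return 1.0 / (1.0 + (ec50 / conc) ** slope)

def independent_action(concs, ec50s, slopes):
    """IA prediction: E_mix = 1 - prod(1 - E_i) over the components."""
    prod = 1.0
    for c, e, s in zip(concs, ec50s, slopes):
        prod *= 1.0 - hill_effect(c, e, s)
    return 1.0 - prod

def concentration_addition_ec50(fracs, ec50s):
    """CA prediction: for a mixture with component mass fractions p_i
    (summing to 1), the mixture EC50 is the total concentration at which
    sum(c_i / EC50_i) = 1, i.e. 1 / sum(p_i / EC50_i). Strictly valid for
    components with similar dose-response slopes."""
    return 1.0 / sum(p / e for p, e in zip(fracs, ec50s))
```

Comparing each model's predictions against observed mixture dose-response data, as the study did by linear regression, is what determines which assumption (similar versus dissimilar modes of action) better describes the mixture.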