938 results for PM3 semi-empirical method


Relevance: 100.00%

Abstract:

The synthesis of helium in the early Universe depends on many input parameters, including the value of the gravitational coupling during the period when nucleosynthesis takes place. We compute the primordial abundance of helium as a function of the gravitational coupling, using a semi-analytical method, in order to track the influence of G on primordial nucleosynthesis. To be specific, we construct a cosmological model with varying G using the Brans-Dicke theory. The greater the value of G at the nucleosynthesis epoch, the greater the predicted abundance of helium. Using the observational data for the abundance of primordial helium, constraints on the time variation of G are established.
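The scaling behind this result (a larger G means faster expansion, earlier weak freeze-out, more surviving neutrons and hence more helium) can be sketched numerically. This is a back-of-the-envelope illustration, not the paper's semi-analytical method: the standard freeze-out temperature, the G^(1/6) scaling, and the neglect of neutron decay before fusion are simplifying assumptions, so the absolute Y_p comes out too high.

```python
import math

DELTA_M = 1.293   # neutron-proton mass difference in MeV
T_F0 = 0.8        # assumed weak freeze-out temperature (MeV) for standard G

def helium_mass_fraction(g_ratio):
    """Rough helium mass fraction Y_p when G = g_ratio * G_standard.

    The weak rate scales as T^5 while the expansion rate scales as
    H ~ sqrt(G) * T^2, so the freeze-out temperature scales as G^(1/6);
    a hotter freeze-out leaves more neutrons and hence more helium.
    """
    t_f = T_F0 * g_ratio ** (1.0 / 6.0)
    n_over_p = math.exp(-DELTA_M / t_f)       # equilibrium n/p at freeze-out
    return 2.0 * n_over_p / (1.0 + n_over_p)  # assume all neutrons end in 4He
```

Even this crude estimate reproduces the monotonic trend stated in the abstract: the helium fraction increases with G.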

Relevance: 100.00%

Abstract:

Personal memories composed of digital pictures are very popular at the moment. Annotation is required to retrieve these media items. In recent years, several approaches have been proposed to address the image-annotation problem. This paper presents our proposals for this problem: automatic and semi-automatic learning methods for semantic concepts. The automatic method estimates semantic concepts from visual content, context metadata and audio information. The semi-automatic method is based on the results provided by a computer game. The paper describes both proposals and presents their evaluations.
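A minimal sketch of how per-modality concept scores might be combined into one annotation confidence. The weighted late-fusion scheme and the modality names below are illustrative assumptions, not the paper's actual estimator:

```python
def fuse_concept_scores(modality_scores, weights=None):
    """Combine per-modality concept scores into one confidence per concept.

    modality_scores: dict mapping modality name -> {concept: score in [0, 1]}
    weights: optional dict mapping modality name -> weight (default: uniform)
    """
    if weights is None:
        weights = {m: 1.0 for m in modality_scores}
    total = sum(weights.values())
    fused = {}
    for modality, scores in modality_scores.items():
        w = weights[modality] / total
        for concept, score in scores.items():
            fused[concept] = fused.get(concept, 0.0) + w * score
    return fused

# Hypothetical scores for one photo from three modalities:
scores = {
    "visual":   {"beach": 0.8, "person": 0.6},
    "metadata": {"beach": 0.9},
    "audio":    {"person": 0.7},
}
fused = fuse_concept_scores(scores)
```

Concepts supported by several modalities accumulate evidence, so "beach" here outranks "person".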

Relevance: 100.00%

Abstract:

Dissertation presented to obtain the degree of Doctor in Conservation and Restoration, specialty in Theory, History and Techniques, at the Universidade Nova de Lisboa, Faculdade de Ciências e Tecnologia.

Relevance: 100.00%

Abstract:

Dissertation submitted to obtain the degree of Master in Civil Engineering, specialization in Hydraulics.

Relevance: 100.00%

Abstract:

The development of resistant Plasmodium falciparum strains has encouraged the search for new antimalarial drugs. Febrifugine is a natural substance with high activity against P. falciparum, but its strong emetic properties and liver toxicity prevent its use as a clinical drug. The search for analogues with better clinical performance is therefore a current topic. We investigate the theoretical electronic structure of the febrifugine derivative family by means of semi-empirical molecular orbital calculations, seeking electronic indexes that could guide the design of new, efficient derivatives. The theoretical results show that several electronic indexes of the most selective molecules cluster in well-defined ranges. The model proposed for achieving high selectivity was tested successfully.
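The clustering result suggests a simple screening rule: accept a candidate derivative only if each computed electronic index falls inside the range spanned by the most selective molecules. The index names and numeric ranges below are hypothetical placeholders, not values taken from the study:

```python
# Hypothetical selectivity screen. Each key is an electronic index a
# semi-empirical calculation might produce; the ranges are illustrative.
SELECTIVITY_RANGES = {
    "homo_energy_ev": (-9.5, -8.5),
    "lumo_energy_ev": (-1.5, -0.5),
    "dipole_debye":   (2.0, 6.0),
}

def is_candidate_selective(indexes):
    """Return True if every electronic index lies within its reference range."""
    return all(
        lo <= indexes[name] <= hi
        for name, (lo, hi) in SELECTIVITY_RANGES.items()
    )
```

A molecule failing any single range is rejected, which mirrors the idea that high selectivity requires all indexes to fall in their observed clusters simultaneously.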

Relevance: 100.00%

Abstract:

With the growth of the internet and the semantic web, together with improvements in communication speed and the rapid growth of storage capacity, the volume of data and information rises considerably every day. Because of this, in the last few years there has been growing interest in structures for formal representation with suitable characteristics, such as the ability to organize data and information and to reuse their contents for the generation of new knowledge. Controlled vocabularies, and ontologies in particular, stand out as representation structures with high potential: they allow not only the representation of data but also its reuse for knowledge extraction, coupled with relatively simple formalisms for subsequent storage. However, to ensure that the knowledge in an ontology stays up to date, ontologies need maintenance. Ontology Learning is the area that studies the update and maintenance of ontologies. The relevant literature already presents first results on automatic maintenance of ontologies, but these are still at a very early stage; updating and maintaining an ontology is still largely a human-driven, and therefore cumbersome, task. New knowledge for ontology growth can be generated with Data Mining techniques, an area that studies data processing, pattern discovery and knowledge extraction in IT systems. This work proposes a novel semi-automatic method for knowledge extraction from unstructured data sources using Data Mining techniques, namely pattern discovery, focused on improving the precision of the concepts and semantic relations present in an ontology. To verify the applicability of the proposed method, a proof of concept was developed and its results are presented, applied to the building and construction sector.
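As a toy example of pattern-based discovery of semantic relations from unstructured text, a single hand-written lexico-syntactic pattern already yields is-a candidates for an ontology. The proposed method mines such patterns from data rather than hard-coding one, so this is only an illustration of the underlying idea:

```python
import re

# Classic "X such as A, B and C" pattern: the word before "such as" is a
# hypernym candidate, the enumerated words are hyponym candidates.
SUCH_AS = re.compile(r"(\w+) such as ((?:\w+(?:, | and )?)+)")

def extract_isa_pairs(text):
    """Return (hyponym, hypernym) candidate pairs found via the pattern."""
    pairs = []
    for match in SUCH_AS.finditer(text):
        hypernym = match.group(1).lower()
        for hyponym in re.split(r", | and ", match.group(2)):
            pairs.append((hyponym.strip().lower(), hypernym))
    return pairs
```

On a construction-domain sentence such as "materials such as concrete, steel and timber", this yields (concrete, materials), (steel, materials) and (timber, materials) as candidate concept relations.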

Relevance: 100.00%

Abstract:

Keywords: diesel engine, soot and NOx emissions, potency product approaches, zero-dimensional models, optical and thermodynamic methods, semi-empirical, combustion strategy, injection rate shape.

Relevance: 100.00%

Abstract:

Report on the scientific sojourn carried out at the University of New South Wales from February to June 2007. Two different biogeochemical models are coupled to a three-dimensional configuration of the Princeton Ocean Model (POM) for the Northwestern Mediterranean Sea (Ahumada and Cruzado, 2007). The first biogeochemical model (BLANES) is the three-dimensional version of the model described by Bahamon and Cruzado (2003) and computes the nitrogen fluxes through six compartments using semi-empirical descriptions of biological processes. The second biogeochemical model (BIOMEC) is the biomechanical NPZD model described in Baird et al. (2004), which uses a combination of physiological and physical descriptions to quantify the rates of planktonic interactions. Physical descriptions include, for example, the diffusion of nutrients to phytoplankton cells and the encounter rate of predators and prey. The link between physical and biogeochemical processes in both models is expressed by the advection-diffusion of the non-conservative tracers. The similarities in the mathematical formulation of the biogeochemical processes in the two models are exploited to determine the parameter set for the biomechanical model that best fits the parameter set used in the first model. Three years of integration have been carried out for each model to reach the so-called perpetual-year run for biogeochemical conditions. Outputs from both models are averaged monthly and then compared to remote sensing images of chlorophyll obtained from the MERIS sensor.
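To illustrate the kind of compartment fluxes an NPZD model resolves, here is a toy explicit-Euler time step with arbitrary rate constants (not those of BLANES or BIOMEC). The fluxes are written so that the total nitrogen in the four compartments is conserved, as it must be before advection-diffusion redistributes the tracers:

```python
def npzd_step(n, p, z, d, dt=0.01,
              uptake=1.0, k_n=0.5, graze=0.4, mort=0.1, remin=0.05):
    """One Euler step of a toy NPZD model; all parameters are illustrative."""
    growth = uptake * n / (k_n + n) * p   # Michaelis-Menten nutrient uptake
    grazing = graze * p * z               # zooplankton grazing on phytoplankton
    losses = mort * (p + z)               # mortality feeds the detritus pool
    remineral = remin * d                 # detritus remineralised to nutrient
    n += dt * (remineral - growth)
    p += dt * (growth - grazing - mort * p)
    z += dt * (grazing - mort * z)
    d += dt * (losses - remineral)
    return n, p, z, d
```

Every loss term from one compartment appears as a gain in another, so the sum n + p + z + d is invariant step by step, which is a standard sanity check for this class of model.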

Relevance: 100.00%

Abstract:

We present a real data set of claims amounts in which costs related to damage are recorded separately from those related to medical expenses. Only claims with positive costs are considered here. Two approaches to density estimation are presented: a classical parametric method and a semi-parametric method based on transformation kernel density estimation. We explore the data set with standard univariate methods. We also propose ways to select the bandwidth and transformation parameters in the univariate case based on Bayesian methods. We indicate how to compare the results of alternative methods, both by examining the shape of the density over its whole domain and by exploring the density estimates in the right tail.
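The idea behind transformation kernel density estimation can be sketched with the simplest transformation, a log transform: fit a Gaussian KDE on the log scale, then map it back to the original scale with the Jacobian. The paper's transformation family and its Bayesian bandwidth selection are richer than this; the fixed bandwidth here is an assumption:

```python
import math

def log_kde(sample, x, bandwidth):
    """Density estimate at x > 0 for positive, right-skewed data.

    A Gaussian kernel density estimate is built on the log scale and
    transformed back, dividing by x (the Jacobian of the log transform).
    """
    logs = [math.log(s) for s in sample]
    u = math.log(x)
    total = sum(math.exp(-0.5 * ((u - l) / bandwidth) ** 2) for l in logs)
    norm = len(sample) * bandwidth * math.sqrt(2.0 * math.pi)
    return total / (norm * x)
```

Because the KDE integrates to one on the log scale, the back-transformed estimate integrates to one over the positive axis, while the transform naturally thickens the right tail relative to a plain Gaussian KDE on the raw data.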

Relevance: 100.00%

Abstract:

The objective of this paper is to identify the role of memory as a screening device in repeated contracts with asymmetric information in financial intermediation. We use an original dataset from the European Bank for Reconstruction and Development and propose a simple empirical method to capture the role of memory through the client's reputation. Our results unambiguously isolate the dominant effect of memory, relative to market factors, on the bank's lending decisions in the case of established clients.

Relevance: 100.00%

Abstract:

We have studied human movements and looked for ways to create these movements in real time in digital environments, so that the work required of artists and animators is reduced. We surveyed the character-animation techniques currently found in the entertainment industry, as well as the main lines of research, studying in detail the most widely used technique: motion capture. Motion capture records a person's movements by means of optical sensors, magnetic sensors and video cameras. This information is stored in files that can later be replayed by a character in real time in a digital application. Every recorded movement must be associated with a character; this is the rigging process. One of the points we worked on was a semi-automatic system for associating the skeleton with the character's mesh, reducing the animator's work in this process. In real-time applications such as virtual reality, the environment the characters live in is increasingly simulated using Newton's laws, so that any change in a body's motion results from a force applied to it. Motion capture does not scale well to these environments, because it cannot create new, realistic animations, derived from the recorded one, that depend on interaction with the environment. The final goal of our work was to create animations from forces in real time, just as happens in reality. To this end we introduced a muscle model and a balance system on the character, so that it can respond realistically to interactions with an environment simulated through Newton's laws.
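The control primitive underlying such muscle and balance models can be sketched as a proportional-derivative (PD) torque that drives a joint toward a target pose under Newtonian integration. The gains, the unit inertia and the single-joint setup are illustrative assumptions, not the thesis' actual muscle model:

```python
def pd_torque(angle, velocity, target, kp=50.0, kd=8.0):
    """PD 'muscle' torque pulling a joint toward a target angle (radians)."""
    return kp * (target - angle) - kd * velocity

def simulate_joint(target, steps=2000, dt=0.005, inertia=1.0):
    """Integrate one joint under the PD torque with semi-implicit Euler."""
    angle, velocity = 0.0, 0.0
    for _ in range(steps):
        accel = pd_torque(angle, velocity, target) / inertia
        velocity += dt * accel
        angle += dt * velocity
    return angle
```

Because the torque is a force-level command rather than a replayed pose, external pushes simply add to the acceleration, which is exactly what makes force-driven characters respond to their simulated environment where raw motion-capture playback cannot.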

Relevance: 100.00%

Abstract:

Introduction: Vertebral fracture is one of the major osteoporotic fractures, yet it very often goes undetected. In addition, it is well known that a prevalent vertebral fracture dramatically increases the risk of future fractures. Instant Vertebral Assessment (IVA) was introduced in DXA devices a couple of years ago to ease the detection of such fractures when routine DXA scans are performed. To guide the correct use of this tool, the ISCD provided clinical recommendations on when and how to use it. The aim of our study was to evaluate the ISCD guidelines in clinical routine patients and to see how often IVA may change patient management. Methods: During two months (March and April 2010), a medical questionnaire was systematically given to our clinical routine patients to check the validity of the ISCD IVA recommendations in our population. In addition, all women had BMD measurements at the AP spine, femur and 1/3 radius using a Discovery A system (Hologic, Waltham, USA). When appropriate, IVA measurement was performed on the same DXA system and centrally evaluated by two trained doctors for fracture status according to the semi-quantitative method of Genant. The reading was performed, when possible, between L5 and T4. Results: Of the 210 women seen in consultation, 109 (52%; mean age 68.2 ± 11.5 years) fulfilled the criteria for an IVA measurement. Of these 109 women, 43 (39.4%) had osteoporosis at one of the three skeletal sites and 31 (28.4%) had at least one vertebral fracture. 14.7% of the women had both osteoporosis and at least one vertebral fracture, classifying them as having "severe osteoporosis", while 46.8% had neither osteoporosis nor a vertebral fracture. 24.8% of the women had osteoporosis but no vertebral fracture, while 13.8% had a vertebral fracture but no osteoporosis (clinical osteoporosis). Conclusion: In 52% of our patients, IVA was indicated according to the ISCD criteria. In half of them, the IVA result influenced patient management, either by changing the type of treatment or simply by classifying the patient as having "clinical osteoporosis". IVA appears to be an important tool in clinical routine but is unfortunately not yet widely used in most centers.


Relevance: 100.00%

Abstract:

Pulse-wave velocity (PWV) is considered the gold-standard method to assess arterial stiffness, an independent predictor of cardiovascular morbidity and mortality. Currently available devices that measure PWV need to be operated by skilled medical staff, which limits the potential use of PWV in the ambulatory setting. In this paper, we present a new technique allowing continuous, unsupervised measurement of pulse transit times (PTT) in central arteries by means of a chest sensor. This technique relies on measuring the propagation time of pressure pulses from their genesis in the left ventricle to their later arrival at the cutaneous vasculature on the sternum. Combined thoracic impedance cardiography and phonocardiography are used to detect the opening of the aortic valve, from which a pre-ejection period (PEP) value is estimated. Multichannel reflective photoplethysmography at the sternum is used to detect the distal pulse-arrival time (PAT). A PTT value is then calculated as PTT = PAT - PEP. After optimizing the parameters of the chest-PTT calculation algorithm on a nine-subject cohort, a prospective validation study involving 31 normo- and hypertensive subjects was performed. 1/chest-PTT correlated very well with the COMPLIOR carotid-to-femoral PWV (r = 0.88, p < 10^-9). Finally, an empirical method to map chest PTT values onto chest PWV values is explored.
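The timing arithmetic described above is straightforward. The sketch below also converts a PTT into a PWV using an assumed fixed arterial path length, whereas the paper maps chest PTT onto PWV empirically; the example timings and the 0.5 m path are illustrative only:

```python
def pulse_transit_time(pat_s, pep_s):
    """Pulse transit time (s): pulse-arrival time minus pre-ejection period."""
    return pat_s - pep_s

def pulse_wave_velocity(ptt_s, path_length_m=0.5):
    """PWV (m/s) over an assumed arterial path length (illustrative value)."""
    return path_length_m / ptt_s

# Hypothetical beat: PAT of 210 ms, PEP of 80 ms -> PTT of 130 ms.
ptt = pulse_transit_time(0.210, 0.080)
pwv = pulse_wave_velocity(ptt)
```

Since PWV is a path length divided by PTT, 1/PTT is proportional to PWV for a fixed path, which is why the study correlates 1/chest-PTT against the reference carotid-to-femoral PWV.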

Relevance: 100.00%

Abstract:

Background: MLPA is a potentially useful semi-quantitative method to detect copy-number alterations in targeted regions. In this paper, we propose a normalization procedure based on a non-linear mixed model, as well as a new approach for determining the statistical significance of altered probes based on a linear mixed model. This method establishes a threshold by using different tolerance intervals that accommodate the specific random-error variability observed in each test sample.

Results: Through simulation studies we have shown that our proposed method outperforms two existing methods based on simple threshold rules or iterative regression. We have illustrated the method using a controlled MLPA assay in which the targeted regions vary in copy number in individuals suffering from disorders such as Prader-Willi, DiGeorge or autism, where it showed the best performance.

Conclusion: Using the proposed mixed model, we are able to determine thresholds for deciding whether a region is altered. These thresholds are specific to each individual and incorporate experimental variability, resulting in improved sensitivity and specificity, as the examples with real data have revealed.
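The sample-specific threshold idea can be sketched as follows. The real method derives tolerance intervals from a linear mixed model; this illustration instead builds the band from each sample's own mean and spread of normalized log-ratios, with an assumed coverage factor k, which captures only the notion of a per-sample threshold:

```python
import statistics

def flag_altered_probes(log_ratios, k=2.5):
    """Flag probes whose normalized log-ratio leaves the sample's band.

    log_ratios: dict mapping probe name -> normalized log-ratio.
    The band is mean +/- k * sd of this sample's own values; k = 2.5 is
    an assumed coverage factor, not the paper's tolerance interval.
    """
    mean = statistics.fmean(log_ratios.values())
    sd = statistics.stdev(log_ratios.values())
    lo, hi = mean - k * sd, mean + k * sd
    return [probe for probe, r in log_ratios.items() if not lo <= r <= hi]
```

Because the band is recomputed per sample, a noisy sample gets a wider band and a clean one a tighter band, which is the intuition behind thresholds that incorporate each test sample's experimental variability.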