979 results for cost estimating tools


Relevance:

30.00%

Publisher:

Abstract:

People go through their lives making all kinds of decisions, and some of these decisions affect their demand for transportation, for example, their choices of where to live and where to work, how and when to travel, and which route to take. Transport-related choices are typically time dependent and characterized by a large number of alternatives that can be spatially correlated. This thesis deals with models that can be used to analyze and predict discrete choices in large-scale networks. The proposed models and methods are highly relevant for, but not limited to, transport applications. We model decisions as sequences of choices within the dynamic discrete choice framework, also known as parametric Markov decision processes. Such models are known to be difficult to estimate and to apply for prediction because dynamic programming problems need to be solved in order to compute choice probabilities. In this thesis we show that it is possible to exploit the network structure and the flexibility of dynamic programming so that the dynamic discrete choice modeling approach is not only useful for modeling time-dependent choices, but also makes it easier to model large-scale static choices. The thesis consists of seven articles containing a number of models and methods for estimating, applying and testing large-scale discrete choice models. In the following we group the contributions under three themes: route choice modeling, large-scale multivariate extreme value (MEV) model estimation, and nonlinear optimization algorithms. Five articles are related to route choice modeling. We propose different dynamic discrete choice models that allow paths to be correlated, based on the MEV and mixed logit models. The resulting route choice models become expensive to estimate, and we deal with this challenge by proposing innovative methods that reduce the estimation cost. For example, we propose a decomposition method that not only opens up the possibility of mixing, but also speeds up the estimation of simple logit models, which also has implications for traffic simulation. Moreover, we compare the utility maximization and regret minimization decision rules, and we propose a misspecification test for logit-based route choice models. The second theme is related to the estimation of static discrete choice models with large choice sets. We establish that a class of MEV models can be reformulated as dynamic discrete choice models on the networks of correlation structures. These dynamic models can then be estimated quickly using dynamic programming techniques and an efficient nonlinear optimization algorithm. Finally, the third theme focuses on structured quasi-Newton techniques for estimating discrete choice models by maximum likelihood. We examine and adapt switching methods that can easily be integrated into standard optimization algorithms (line search and trust region) to accelerate the estimation process. The proposed dynamic discrete choice models and estimation methods can be used in various discrete choice applications. In the area of big data analytics, models that can deal with large choice sets and sequential choices are important. Our research can therefore be of interest for various demand analysis applications (predictive analytics) or can be integrated with optimization models (prescriptive analytics). Furthermore, our studies indicate the potential of dynamic programming techniques in this context, even for static models, which opens up a variety of future research directions.
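To make the dynamic-programming idea behind these models concrete, here is a minimal sketch (not taken from the thesis; the toy network, utilities and names are assumptions) of a recursive-logit-style route choice computation, where the expected downstream utility at each node satisfies a logsum fixed point solved by value iteration:

```python
import math

# Toy directed network: node -> list of (next_node, instantaneous arc utility).
# Both the network and the utilities are hypothetical, for illustration only.
arcs = {
    "A": [("B", -1.0), ("C", -1.5)],
    "B": [("D", -1.0)],
    "C": [("D", -0.5)],
    "D": [],  # destination: absorbing, V("D") = 0
}

def value_iteration(arcs, dest, tol=1e-10, max_iter=1000):
    """Solve V(k) = log sum_a exp(v(a|k) + V(next(a))) by fixed-point iteration."""
    V = {k: 0.0 for k in arcs}
    for _ in range(max_iter):
        V_new = {}
        for k, out in arcs.items():
            if k == dest or not out:
                V_new[k] = 0.0
            else:
                V_new[k] = math.log(sum(math.exp(v + V[nxt]) for nxt, v in out))
        if max(abs(V_new[k] - V[k]) for k in arcs) < tol:
            return V_new
        V = V_new
    return V

V = value_iteration(arcs, dest="D")
# Link choice probabilities at a node follow a logit over v(a|k) + V(next(a)).
probs_at_A = {nxt: math.exp(v + V[nxt] - V["A"]) for nxt, v in arcs["A"]}
print(V, probs_at_A)
```

Because the choice probabilities come out of this fixed point, estimating such a model means solving the recursion repeatedly inside the likelihood, which is why the estimation-cost reductions described above matter.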

Relevance:

30.00%

Publisher:

Abstract:

Developments in theory and experiment have raised the prospect of an electronic technology based on the discrete nature of electron tunnelling through a potential barrier. This thesis deals with novel design and analysis tools developed to study such systems. Possible devices include those constructed from ultrasmall normal tunnelling junctions. These exhibit charging effects, including the Coulomb blockade and correlated electron tunnelling. They allow transistor-like control of the transfer of single carriers, and present the prospect of digital systems operating at the information-theoretic limit. As such, they are often referred to as single electronic devices. Single electronic devices exhibit self-quantising logic and good structural tolerance. Their speed, immunity to thermal noise, and operating voltage all scale beneficially with junction capacitance. For ultrasmall junctions the possibility of room-temperature operation at sub-picosecond timescales seems feasible. However, they are sensitive to external charge, whether from trapping-detrapping events, externally gated potentials, or system cross-talk. Quantum effects such as charge macroscopic quantum tunnelling may degrade performance. Finally, any practical system will be complex and spatially extended (amplifying the above problems), and prone to fabrication imperfection. This summarises why new design and analysis tools are required. Simulation tools are developed, concentrating on the basic building blocks of single electronic systems: the tunnelling junction array and the gated turnstile device. Three main points are considered: the best method of estimating capacitance values from the physical system geometry; the mathematical model that should represent electron tunnelling based on these data; and the application of this model to the investigation of single electronic systems. (DXN004909)
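As a hedged illustration of the kind of mathematical model the last point refers to (the standard "orthodox" theory of single-electron tunnelling, not code from the thesis; the junction parameters below are assumptions), the tunnelling rate through a junction can be written as a function of the free-energy change of the event:

```python
import math

# Physical constants (SI units).
E_CHARGE = 1.602176634e-19   # elementary charge, C
K_B = 1.380649e-23           # Boltzmann constant, J/K

def orthodox_rate(delta_F, R_T, T):
    """Orthodox-theory tunnelling rate (1/s) for a single junction.

    delta_F : free-energy change of the tunnelling event, J (negative = favourable)
    R_T     : tunnel resistance of the junction, ohms
    T       : temperature, K
    """
    x = delta_F / (K_B * T)
    if abs(x) < 1e-12:
        # Limit delta_F -> 0: rate -> k_B T / (e^2 R_T)
        return K_B * T / (E_CHARGE**2 * R_T)
    return (-delta_F / (E_CHARGE**2 * R_T)) / (1.0 - math.exp(x))

# Example with hypothetical numbers: a 1 aF junction at 4.2 K,
# gaining one charging energy e^2/(2C) in the tunnelling event.
C = 1e-18                              # junction capacitance, F
delta_F = -E_CHARGE**2 / (2 * C)
print(orthodox_rate(delta_F, R_T=100e3, T=4.2))
```

Simulators of junction arrays typically evaluate rates like this for every candidate tunnelling event and then advance the system with a Monte Carlo or master-equation scheme; the capacitance values that feed delta_F are exactly what the first of the three points above is about.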

Relevance:

30.00%

Publisher:

Abstract:

66 p.

Relevance:

30.00%

Publisher:

Abstract:

This thesis focuses on the Helicon Plasma Thruster (HPT) as a candidate for generating thrust for small satellites and CubeSats. Two main topics are addressed: the development of a Global Model (GM) and of a 3D self-consistent numerical tool. The GM is suitable for the preliminary analysis of HPTs operating with noble gases such as argon, neon, krypton, and xenon, and with alternative propellants such as air and iodine. A lumping methodology is developed to reduce the computational cost of modelling the excited species in the plasma chemistry. The 3D self-consistent numerical tool can treat discharges with a generic 3D geometry and model the actual plasma-antenna coupling. It consists of two main modules, an EM module and a FLUID module, which run iteratively until a steady-state solution is reached. A third module is available for solving the plume with a simplified semi-analytical approach, a PIC code, or directly by integration of the fluid equations. Results obtained from both numerical tools are benchmarked against experimental measurements of HPTs or Helicon reactors: the GM shows very good qualitative agreement with the experimental trends, and the 3D numerical strategy shows excellent agreement between the predicted physical trends and the measured data.
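For orientation, a 0-D global model of this kind balances particle production against wall losses and absorbed power against ionisation and wall energy losses. The sketch below uses hypothetical rate coefficients, geometry and power, and is not the GM developed in the thesis:

```python
import numpy as np
from scipy.integrate import solve_ivp

# All parameter values below are placeholder assumptions for illustration.
E_CHARGE = 1.602e-19      # C
V_CHAMBER = 1e-4          # plasma volume, m^3
A_LOSS = 1e-2             # effective loss area, m^2
N_GAS = 1e20              # neutral gas density, m^-3
P_ABS = 50.0              # absorbed RF power, W
E_ION = 15.76 * E_CHARGE  # argon ionisation energy, J
M_ION = 6.63e-26          # argon ion mass, kg

def k_ionisation(Te_eV):
    # Hypothetical Arrhenius-style fit for the ionisation rate coefficient (m^3/s).
    return 2.3e-14 * np.sqrt(Te_eV) * np.exp(-15.76 / Te_eV)

def rhs(t, y):
    ne, Te_eV = y
    u_bohm = np.sqrt(E_CHARGE * Te_eV / M_ION)          # Bohm velocity
    ionisation = k_ionisation(Te_eV) * N_GAS * ne        # volume ionisation rate
    wall_loss = ne * u_bohm * A_LOSS / V_CHAMBER         # flux to the walls
    dne = ionisation - wall_loss
    # Power balance d(3/2 ne Te)/dt = P_abs/V - losses, rearranged for dTe/dt.
    energy_loss = ionisation * E_ION + wall_loss * 5.0 * E_CHARGE * Te_eV
    dTe = (P_ABS / V_CHAMBER - energy_loss) / (1.5 * ne * E_CHARGE) - (Te_eV / ne) * dne
    return [dne, dTe]

sol = solve_ivp(rhs, (0.0, 1e-3), [1e17, 5.0], method="LSODA")
print(sol.y[0][-1], sol.y[1][-1])   # approximate steady-state ne (m^-3) and Te (eV)
```

The lumping methodology mentioned above would add similar balance equations for grouped excited states instead of tracking each excited species individually.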

Relevance:

20.00%

Publisher:

Abstract:

High-throughput screening of physical, genetic and chemical-genetic interactions brings important perspectives to the Systems Biology field, as the analysis of these interactions provides new insights into protein/gene function, cellular metabolic variations and the validation of therapeutic targets and drug design. However, such analysis depends on a pipeline connecting different tools that can automatically integrate data from diverse sources and produce a more comprehensive dataset that can be properly interpreted. We describe here the Integrated Interactome System (IIS), an integrative platform with a web-based interface for the annotation, analysis and visualization of the interaction profiles of proteins/genes, metabolites and drugs of interest. IIS works in four connected modules: (i) the Submission module, which receives raw data derived from Sanger sequencing (e.g. the two-hybrid system); (ii) the Search module, which enables the user to search for the processed reads to be assembled into contigs/singlets, or for lists of proteins/genes, metabolites and drugs of interest, and add them to the project; (iii) the Annotation module, which assigns annotations from several databases to the contigs/singlets or lists of proteins/genes, generating tables of automatic annotation that can be manually curated; and (iv) the Interactome module, which maps the contigs/singlets or the uploaded lists to entries in our integrated database, building networks that gather novel identified interactions, protein and metabolite expression/concentration levels, subcellular localization, computed topological metrics, and GO biological process and KEGG pathway enrichment. This module generates an XGMML file that can be imported into Cytoscape or visualized directly on the web. We developed IIS by integrating diverse databases in response to the need for appropriate tools for the systematic analysis of physical, genetic and chemical-genetic interactions. IIS was validated with yeast two-hybrid, proteomics and metabolomics datasets, but it is also extendable to other datasets. IIS is freely available online at: http://www.lge.ibi.unicamp.br/lnbio/IIS/.
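For context, XGMML is an XML graph dialect that Cytoscape can import. The sketch below (illustrative only, not IIS code; the toy interactions and file name are assumptions) writes a small interaction network in that format:

```python
import xml.etree.ElementTree as ET

# Hypothetical toy interaction network: (protein A, protein B) pairs.
interactions = [("YFG1", "YFG2"), ("YFG2", "YFG3")]

# XGMML is an XML dialect; build a minimal <graph> with <node> and <edge> elements.
graph = ET.Element("graph", {
    "label": "toy_interactome",
    "xmlns": "http://www.cs.rpi.edu/XGMML",
})
node_ids = {}
for a, b in interactions:
    for name in (a, b):
        if name not in node_ids:
            node_ids[name] = str(len(node_ids) + 1)
            ET.SubElement(graph, "node", {"id": node_ids[name], "label": name})
for a, b in interactions:
    ET.SubElement(graph, "edge", {
        "label": f"{a} (pp) {b}",
        "source": node_ids[a],
        "target": node_ids[b],
    })

ET.ElementTree(graph).write("toy_interactome.xgmml",
                            encoding="utf-8", xml_declaration=True)
```

A platform like IIS would attach further attributes to each node and edge (expression levels, subcellular localization, enrichment terms) before export, but the basic node/edge structure is the same.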

Relevance:

20.00%

Publisher:

Abstract:

This chapter provides a short review of quantum dot (QD) physics, applications, and perspectives. The main advantage of QDs over bulk semiconductors is that size becomes a control parameter to tailor the optical properties of new materials. Size changes the confinement energy, which alters the optical properties of the material, such as absorption, refractive index, and emission bands. Therefore, by using QDs one can make several kinds of optical devices. One class of devices transforms electrons into photons, for use as active optical components in illumination and displays. Other devices transform photons into electrons, to produce QD solar cells or photodetectors. At the biomedical interface, the application of QDs, which is the most important aspect of this book, is based on fluorescence, which essentially transforms photons into photons of different wavelengths. This chapter introduces parameters that are important for the biophotonic applications of QDs, such as photostability, excitation and emission profiles, and quantum efficiency. We also present the perspectives for the use of QDs in fluorescence lifetime imaging (FLIM) and Förster resonance energy transfer (FRET), so useful in modern microscopy, and show how to take advantage of the usually unwanted blinking effect to perform super-resolution microscopy.
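A standard effective-mass ("particle in a sphere", Brus-type) estimate makes the size dependence mentioned above explicit; it is given here for illustration and is not taken from the chapter:

```latex
% Illustrative effective-mass (Brus-type) estimate of the size-dependent band gap
% of a spherical quantum dot of radius R; not taken from the chapter.
\[
  E_g(R) \;\approx\; E_g^{\text{bulk}}
  \;+\; \frac{\hbar^2 \pi^2}{2R^2}\!\left(\frac{1}{m_e^*} + \frac{1}{m_h^*}\right)
  \;-\; \frac{1.8\, e^2}{4\pi \varepsilon \varepsilon_0 R},
\]
% where $m_e^*$ and $m_h^*$ are the effective electron and hole masses and
% $\varepsilon$ is the relative permittivity: the confinement term grows as
% $1/R^2$, so smaller dots absorb and emit at shorter wavelengths.
```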

Relevance:

20.00%

Publisher:

Abstract:

Laboratory tests are essential for accurate diagnosis and cost-effective management of thyroid disorders. When the clinical suspicion is strong, hormone levels merely confirm the diagnosis. However, in most patients symptoms are subtle and unspecific, so that only biochemical tests can detect the disorder. The objective of this article is to provide a critical analysis of the appropriate use of the most important thyroid function tests, including serum concentrations of thyrotropin (TSH), thyroid hormones and antithyroid antibodies. Based on a survey of the MedLine database, we discuss the major pitfalls and interferences related to the daily use of these tests, and recommendations are presented to optimize the use of these diagnostic tools in clinical practice.

Relevance:

20.00%

Publisher:

Abstract:

Universidade Estadual de Campinas. Faculdade de Educação Física

Relevance:

20.00%

Publisher:

Abstract:

The quantification of the energy available in the environment is important because it determines photosynthesis, evapotranspiration and, therefore, the final yield of crops. Instruments for measuring the energy balance are costly, and indirect estimation alternatives are desirable. This study assessed the performance of Deardorff's model during a cycle of a sugarcane crop in Piracicaba, State of São Paulo, Brazil, in comparison with the aerodynamic method. This mechanistic model simulates the energy fluxes (sensible heat, latent heat and net radiation) at three levels (atmosphere, canopy and soil) using only air temperature, relative humidity and wind speed measured at a reference level above the canopy, the crop leaf area index, and some pre-calibrated parameters (canopy albedo, soil emissivity, atmospheric transmissivity and hydrological characteristics of the soil). The analysis was made for different time scales, insolation conditions and seasons (spring, summer and autumn). Analyzing all data at 15-minute intervals, the model performed well in simulating net radiation under different insolation conditions and seasons. The latent and sensible heat fluxes in the atmosphere did not differ from the aerodynamic method data during the autumn. The sensible heat flux in the soil was poorly simulated by the model, owing to the poor performance of the soil water balance method. In general, Deardorff's model improved the flux simulations in comparison with the aerodynamic method when more insolation was available in the environment.
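For context, the fluxes simulated by the model are linked by the standard surface energy balance (general micrometeorology, not a result of this paper):

```latex
% Standard surface energy balance closing the simulated fluxes (for context only):
% net radiation is partitioned into sensible, latent and soil heat fluxes.
\[
  R_n \;=\; H \;+\; \lambda E \;+\; G,
\]
% where $R_n$ is net radiation, $H$ the sensible heat flux, $\lambda E$ the latent
% heat flux and $G$ the soil heat flux (all in W\,m$^{-2}$).
```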

Relevance:

20.00%

Publisher:

Abstract:

The purpose of this study was to develop and validate equations to estimate the aboveground phytomass of a 30-year-old plot of Atlantic Forest. In two plots of 100 m², a total of 82 trees were cut down at ground level. For each tree, height and diameter were measured. Leaves and woody material were separated in order to determine their fresh weights in field conditions. Samples of each fraction were oven-dried at 80 °C to constant weight to determine their dry weight. Tree data were divided into two random samples. One sample was used for the development of the regression equations, and the other for validation. The models were developed using simple linear regression analysis, where the dependent variable was the dry mass and the independent variables were height (h), diameter (d) and d²h. The validation was carried out using the Pearson correlation coefficient, the paired Student's t-test and the standard error of estimation. The best equations to estimate aboveground phytomass were: ln(DW) = -3.068 + 2.522 ln(d) (r² = 0.91; s_y/x = 0.67) and ln(DW) = -3.676 + 0.951 ln(d²h) (r² = 0.94; s_y/x = 0.56).
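A minimal sketch of applying the two fitted equations reported above to estimate tree dry weight; the units assumed for d and h and the plain exponential back-transformation (no bias-correction factor) are my assumptions, not stated in the abstract:

```python
import math

def dry_weight_from_d(d_cm):
    """ln(DW) = -3.068 + 2.522 ln(d)  ->  DW, per the fitted equation (d assumed in cm)."""
    return math.exp(-3.068 + 2.522 * math.log(d_cm))

def dry_weight_from_d2h(d_cm, h_m):
    """ln(DW) = -3.676 + 0.951 ln(d^2 h)  ->  DW, per the fitted equation (h assumed in m)."""
    return math.exp(-3.676 + 0.951 * math.log(d_cm**2 * h_m))

# Hypothetical tree: d = 12 cm, h = 10 m.
print(dry_weight_from_d(12.0), dry_weight_from_d2h(12.0, 10.0))
```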

Relevance:

20.00%

Publisher:

Abstract:

Studies have shown that both carbon dioxide (CO2) and octenol (1-octen-3-ol) are effective attractants for mosquitoes. The objective of the present study was to evaluate the attractiveness of 1-octen-3-ol and CO2 to diurnal mosquitoes in the southeastern Atlantic Forest. A Latin square experimental design was employed with four treatments: CDC light trap (CDC-LT), CDC-LT with 1-octen-3-ol, CDC-LT with CO2, and CDC-LT with both 1-octen-3-ol and CO2. Results demonstrated that both CDC-CO2 and CDC-CO2-1-octen-3-ol captured a greater number of mosquito species and specimens than CDC-1-octen-3-ol; CDC-LT was used as the control. Interestingly, Anopheles (Kerteszia) sp. was generally attracted to 1-octen-3-ol, whereas Aedes serratus was the most abundant species in all Latin square collections. This species was recently shown to be competent to transmit the yellow fever virus and may therefore play a role as a disease vector in rural areas of Brazil.

Relevance:

20.00%

Publisher:

Abstract:

The network of HIV counseling and testing centers in São Paulo, Brazil is a major source of data used to build epidemiological profiles of the client population. We examined HIV-1 incidence from November 2000 to April 2001, comparing epidemiological and socio-behavioral data of recently-infected individuals with those with long-standing infection. A less sensitive ELISA was employed to identify recent infection. The overall incidence of HIV-1 infection was 0.53/100/year (95% CI: 0.31-0.85/100/year): 0.77/100/year for males (95% CI: 0.42-1.27/100/year) and 0.22/100/year (95% CI: 0.05-0.59/100/year) for females. Overall HIV-1 prevalence was 3.2% (95% CI: 2.8-3.7%), being 4.0% among males (95% CI: 3.3-4.7%) and 2.1% among females (95% CI: 1.6-2.8%). Recent infections accounted for 15% of the total (95% CI: 10.2-20.8%). Recent infection correlated with being younger and male (p = 0.019). Therefore, recent infection was more common among younger males and older females.

Relevance:

20.00%

Publisher:

Abstract:

Medium density fiberboard (MDF) is an engineered wood product formed by breaking down selected lignin-cellulosic residual material into fibers, combining them with wax and a resin binder, and then forming panels by applying high temperature and pressure. Because the raw material in the industrial process is ever-changing, the panel industry requires methods for monitoring the composition of its products. The aim of this study was to estimate the ratio of sugarcane (SC) bagasse to Eucalyptus wood in MDF panels using near infrared (NIR) spectroscopy. Principal component analysis (PCA) and partial least squares (PLS) regressions were performed. MDF panels with different bagasse contents were easily distinguished from each other by PCA of their NIR spectra, with clearly different patterns of response. The PLS-R models for the SC content of these MDF samples presented a strong coefficient of determination (0.96) between the NIR-predicted and lab-determined values and a low standard error of prediction (~1.5%) in the cross-validations. A key role of resins (adhesives), cellulose, and lignin in such PLS-R calibrations was shown. The PLS-DA model correctly classified ninety-four percent of the MDF samples in cross-validation and ninety-eight percent of the panels in an independent test set. These NIR-based models can be useful to quickly estimate the sugarcane bagasse to Eucalyptus wood content ratio in unknown MDF samples and to verify the quality of these engineered wood products in an online process.
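A minimal sketch of the kind of PLS calibration described, using scikit-learn on synthetic stand-in spectra; the data, number of latent variables and validation scheme are assumptions, not the paper's:

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import cross_val_predict

rng = np.random.default_rng(0)

# Synthetic stand-in for NIR spectra: 60 panels x 200 wavelengths, where the
# bagasse fraction y (0-1) shifts the weight of two broad absorption bands.
n_samples, n_wavelengths = 60, 200
y = rng.uniform(0.0, 1.0, n_samples)
wl = np.linspace(0.0, 1.0, n_wavelengths)
band1 = np.exp(-((wl - 0.3) ** 2) / 0.005)
band2 = np.exp(-((wl - 0.7) ** 2) / 0.005)
X = np.outer(y, band1) + np.outer(1.0 - y, band2)
X += rng.normal(scale=0.01, size=X.shape)          # instrument noise

# PLS regression with a handful of latent variables, evaluated by cross-validation.
pls = PLSRegression(n_components=4)
y_cv = cross_val_predict(pls, X, y, cv=10).ravel()

r2 = 1.0 - np.sum((y - y_cv) ** 2) / np.sum((y - y.mean()) ** 2)
sep = np.sqrt(np.mean((y - y_cv) ** 2))             # standard error of prediction
print(f"cross-validated R2 = {r2:.3f}, SEP = {sep:.3f}")
```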

Relevance:

20.00%

Publisher:

Abstract:

Background: The Brazilian consensus recommends a short-term treatment course with clarithromycin, amoxicillin and a proton-pump inhibitor for the eradication of Helicobacter pylori (H. pylori). This treatment course has good efficacy, but cannot be afforded by a large part of the population. Azithromycin, amoxicillin and omeprazole are subsidized, for several purposes, by the Brazilian federal government. A short-term treatment course using these drugs would therefore be a low-cost one, but its efficacy in eradicating the bacterium has yet to be demonstrated. The purpose of the study was to verify the efficacy of H. pylori eradication in infected patients presenting with peptic ulcer disease, using the association of azithromycin, amoxicillin and omeprazole. Methods: Sixty patients with peptic ulcer diagnosed by upper digestive endoscopy and H. pylori infection documented by rapid urease test, histological analysis and urea breath test were treated for six days with a combination of azithromycin 500 mg and omeprazole 20 mg in a single daily dose, associated with amoxicillin 500 mg three times a day. Eradication was assessed 12 weeks after treatment by means of the same diagnostic tests. The eradication rates were calculated with 95% confidence intervals. Results: The eradication rate was 38% by intention to treat and 41% per protocol. Few adverse effects were observed and treatment compliance was high. Conclusion: Despite its low cost and high compliance, the low eradication rate does not allow the triple therapy with azithromycin to be recommended as an adequate treatment for H. pylori infection.

Relevance:

20.00%

Publisher:

Abstract:

According to the Brazilian National Data Survey, diabetes is the fifth leading cause of hospitalization and one of the ten major causes of mortality in the country. Aims: to stratify the estimated cardiovascular risk (eCVR) in a population of type 2 diabetic patients (T2DM) according to the Framingham prediction equations, and to determine the association of the eCVR with the metabolic and clinical control of the disease. Methods: From 2000 to 2001 a cross-sectional multicenter study was conducted in 13 public outpatient diabetes/endocrinology clinics in 8 Brazilian cities. The 10-year risk of developing coronary heart disease (CHD) was estimated with the prediction equations described by Wilson et al. (Circulation, 1998). The LDL equations were used preferentially; when LDL data were missing, the total cholesterol equations were used instead. Results: Data from 1382 patients (59.0% female) were analyzed. The median and interquartile range (IQR) of age and duration of diabetes were 57.4 (51-65) and 8.8 (3-13) years, respectively, with no differences according to gender. Forty-two percent of these patients were overweight and 35.4% were obese (the prevalence of overweight and obesity in this T2DM group was significantly higher in women than in men; p < 0.001). The overall eCVR in T2DM patients was 21.4 (13.5-31.3). The eCVR was high (> 20%) in 738 (53.4%), intermediate in 202 (14.6%) and low in 442 (32%) patients. Men [25.1 (15.4-37.3)] showed a higher eCVR than women [18.8 (12.4-27.9); p < 0.001]. The most common risk factor was high LDL-cholesterol (80.8%), found more frequently in women than in men (p = 0.01). The median number of risk factors present was three (2-4), without gender differences. Overall, 60 (4.3%) of our patients had no risk factors, 154 (11.1%) had one, 310 (22.4%) two, 385 (27.9%) three, 300 (21.7%) four, 149 (10.5%) five, and 2% six risk factors. A higher eCVR was noted in overweight or obese patients (p = 0.01 for both groups). No association was found between the eCVR and age or a specific type of diabetes treatment. A correlation was found between the eCVR and duration of diabetes (p < 0.001), BMI (p < 0.001), creatinine (p < 0.001) and triglyceride levels (p < 0.001), but not with HbA1c, fasting blood glucose or postprandial glucose. A higher eCVR was observed in patients with retinopathy (p < 0.001), with a similar tendency in patients with microalbuminuria (p = 0.06). Conclusion: Our study showed that in this group of Brazilian T2DM patients the eCVR correlated with the lipid profile and was higher in patients with chronic microvascular complications. No correlation was found with glycemic control parameters. These data could explain the failure, observed in some studies, of intensive glycemic control programs aiming to reduce cardiovascular events.
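For orientation, risk equations of the Framingham family combine a linear predictor of risk factors with a baseline 10-year survival. The sketch below uses hypothetical placeholder coefficients, means and baseline survival, not the Wilson et al. values used in the study:

```python
import math

# Hypothetical placeholder coefficients -- NOT the published Framingham/Wilson values.
BETA = {
    "age": 0.05,          # per year
    "male": 0.45,         # indicator
    "smoker": 0.55,       # indicator
    "sbp": 0.015,         # per mmHg systolic blood pressure
    "total_chol": 0.008,  # per mg/dL
    "hdl": -0.025,        # per mg/dL
    "diabetes": 0.60,     # indicator
}
MEAN = {"age": 55, "male": 0.45, "smoker": 0.3, "sbp": 130,
        "total_chol": 210, "hdl": 48, "diabetes": 0.1}
S0_10Y = 0.90             # hypothetical baseline 10-year CHD-free survival

def ten_year_chd_risk(profile):
    """Cox-type risk equation: 1 - S0 ** exp(beta . (x - x_mean))."""
    lp = sum(BETA[k] * (profile[k] - MEAN[k]) for k in BETA)
    return 1.0 - S0_10Y ** math.exp(lp)

patient = {"age": 62, "male": 1, "smoker": 0, "sbp": 145,
           "total_chol": 240, "hdl": 38, "diabetes": 1}
risk = ten_year_chd_risk(patient)
print(f"estimated 10-year CHD risk: {risk:.1%}")   # >20% would fall in the study's 'high' stratum
```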