979 results for cost estimating tools


Relevance:

40.00%

Publisher:

Abstract:

"July 1948."

Relevance:

40.00%

Publisher:

Abstract:

We explore the recently developed snapshot-based dynamic mode decomposition (DMD) technique, a matrix-free Arnoldi-type method, to predict 3D linear global flow instabilities. We apply the DMD technique to flows confined in an L-shaped cavity and compare the resulting modes to their counterparts issued from classic, matrix-forming, linear instability analysis (i.e. the BiGlobal approach) and direct numerical simulations. Results show that the DMD technique, which uses snapshots generated by a 3D non-linear incompressible discontinuous Galerkin Navier–Stokes solver, provides very similar results to classical linear instability analysis techniques. In addition, we compare DMD results issued from non-linear and linearised Navier–Stokes solvers, showing that linearisation is not necessary (i.e. a base flow is not required) to obtain linear modes, as long as the analysis is restricted to the exponential growth regime, that is, the flow regime governed by the linearised Navier–Stokes equations; this demonstrates the potential of snapshot-based analysis for general-purpose CFD codes, without the need for modifications. Finally, this work shows that the DMD technique can provide three-dimensional direct and adjoint modes through snapshots provided by the linearised and adjoint linearised Navier–Stokes equations advanced in time. Subsequently, these modes are used to provide structural sensitivity maps and sensitivity to base flow modification for 3D flows and complex geometries, at an affordable computational cost. The information provided by the sensitivity study is used to modify the L-shaped geometry and control the most unstable 3D mode.
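
For reference, a minimal sketch of snapshot-based DMD in its standard SVD ("exact DMD") formulation; the snapshot-matrix layout and the truncation rank r are assumptions for illustration, not the authors' implementation:

```python
import numpy as np

def dmd(snapshots, r):
    """Exact DMD from a (n_dof, n_t) matrix of uniformly sampled flow states."""
    X, Y = snapshots[:, :-1], snapshots[:, 1:]      # shifted snapshot pairs
    U, s, Vh = np.linalg.svd(X, full_matrices=False)
    U, s, V = U[:, :r], s[:r], Vh[:r, :].conj().T   # rank-r truncation
    Atilde = U.conj().T @ Y @ V / s                 # projected linear operator
    mu, W = np.linalg.eig(Atilde)                   # Ritz values
    modes = (Y @ V / s) @ W                         # exact DMD modes
    return mu, modes
```

Modes with |mu| > 1 grow in time; continuous-time growth rates and frequencies follow from log(mu)/dt, which is how such Ritz values can be compared against eigenvalues from a matrix-forming BiGlobal analysis.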

Relevance:

30.00%

Publisher:

Abstract:

Background: Tuberculosis is one of the most prominent health problems in the world, causing 1.75 million deaths each year. Rapid clinical diagnosis is important in patients who have comorbidities such as Human Immunodeficiency Virus (HIV) infection. Direct microscopy has low sensitivity and culture takes 3 to 6 weeks [1-3]. Therefore, new tools for TB diagnosis are necessary, especially in health settings with a high prevalence of HIV/TB co-infection. Methods: In a public reference TB/HIV hospital in Brazil, we compared the cost-effectiveness of two strategies for the diagnosis of pulmonary TB (PTB): acid-fast bacilli smear microscopy by Ziehl-Neelsen staining (AFB smear) plus culture, and AFB smear plus a colorimetric test (PCR dot-blot). From May 2003 to May 2004, sputum was collected consecutively from PTB suspects attending the Parthenon Reference Hospital. Sputum samples were examined by AFB smear, culture, and PCR dot-blot. The gold standard was a positive culture combined with the clinical definition of PTB. The cost analysis included health service and patient costs. Results: AFB smear plus PCR dot-blot requires the lowest laboratory investment in equipment (US$ 20,000). Total screening costs for AFB smear plus culture were 3.8 times those for AFB smear plus PCR dot-blot (US$ 5,635,760 versus US$ 1,498,660). Costs per correctly diagnosed case were US$ 50,773 for AFB smear plus culture and US$ 13,749 for AFB smear plus PCR dot-blot. AFB smear plus PCR dot-blot was more cost-effective than AFB smear plus culture when the cost of treating all correctly diagnosed cases was considered. The cost of returning patients who were not treated, owing to a negative result, to the health service was higher for AFB smear plus culture than for AFB smear plus PCR dot-blot (US$ 374,778,045 versus US$ 110,849,055). Conclusion: AFB smear combined with PCR dot-blot has the potential to be a cost-effective tool in the fight against PTB for patients attending a TB/HIV reference hospital.
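
As a quick arithmetic check on the reported figures, a minimal sketch; the case-count input to the cost-effectiveness ratio is a hypothetical placeholder, not a number from the study:

```python
# Reported total screening costs from the abstract (US$).
cost_culture = 5_635_760   # AFB smear + culture
cost_dotblot = 1_498_660   # AFB smear + PCR dot-blot

print(f"cost ratio: {cost_culture / cost_dotblot:.1f}x")  # ~3.8x, as reported

def cost_per_correct_diagnosis(total_cost: float, n_correct: int) -> float:
    """Cost-effectiveness ratio used in the study: cost per correctly diagnosed case."""
    return total_cost / n_correct
```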

Relevance:

30.00%

Publisher:

Abstract:

From an energy perspective of cost reduction and configuration optimization, it becomes necessary to develop and use advanced tools for the analysis, design, and improvement of energy conversion systems. In the aeronautical industry, this trend is fundamental, since the industry has evolved to design extremely complex aircraft with highly integrated systems, requiring more information to evaluate the system as a whole. The aim of this paper is to present an exergy-based analysis to evaluate the global performance of a typical turbofan engine and its components. The study presents exergy efficiency values over the whole flight cycle, identifies critical equipment and flight phases in terms of exergy destruction, and estimates internal and exhaust flow costs. (C) 2009 Elsevier Ltd. All rights reserved.
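
A minimal sketch of the component-level exergy bookkeeping such an analysis rests on; the dead-state temperature and the function inputs are standard textbook definitions, not values from the paper:

```python
T0 = 288.15  # K, dead-state (reference) temperature; an assumed sea-level value

def flow_exergy(h, h0, s, s0):
    """Specific flow exergy of a stream: ex = (h - h0) - T0*(s - s0), in J/kg."""
    return (h - h0) - T0 * (s - s0)

def exergy_efficiency(ex_product, ex_fuel):
    """Rational exergy efficiency of a component: useful exergy out per exergy in."""
    return ex_product / ex_fuel

def exergy_destruction(ex_in, ex_out):
    """Exergy destroyed inside a control volume (exergy balance residual)."""
    return ex_in - ex_out
```

Ranking components and flight phases by exergy destruction is what identifies the "critical" items the abstract refers to.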

Relevance:

30.00%

Publisher:

Abstract:

A new method of estimating the economic value of life is proposed. Using cross-country data, an equation is estimated to explain life expectancy as a function of real consumption of goods and services. The associated cost function for life expectancy in terms of the prices of specific goods and services is used to estimate the cost of a reduction in age-specific mortality rates sufficient to save the life of one person. The cost of saving a life in OECD countries is as much as 1000 times that in the poorest countries. Ethical implications are discussed.
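
A minimal sketch of the kind of cross-country fit described; the data points and the log-linear functional form are illustrative assumptions, not the paper's actual specification or estimates:

```python
import numpy as np

# Hypothetical cross-country data: real consumption per capita (US$) and
# life expectancy (years).
c = np.array([800.0, 2500.0, 9000.0, 24000.0, 41000.0])
L = np.array([52.0, 63.0, 72.0, 78.0, 81.0])

# Fit L = a + b*ln(c) by least squares.
A = np.vstack([np.ones_like(c), np.log(c)]).T
a, b = np.linalg.lstsq(A, L, rcond=None)[0]

# Implied marginal cost of one additional year of life expectancy: dc/dL = c/b.
marginal_cost_per_year = c / b
```

Because dc/dL scales with consumption itself under this form, the marginal cost of a life-year is far higher in rich countries, directionally consistent with the 1000-fold gap the abstract reports.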

Relevance:

30.00%

Publisher:

Abstract:

Within the development of motor vehicles, crash safety (e.g. occupant protection, pedestrian protection, low-speed damageability) is one of the most important attributes. In order to fulfil increased requirements within shorter cycle times and under rising cost pressure, car manufacturers keep intensifying the use of virtual development tools, such as those in the domain of Computer Aided Engineering (CAE). For crash simulations, the explicit finite element method (FEM) is applied. The accuracy of the simulation process is highly dependent on the accuracy of the simulation model, including the midplane mesh. One of the roughest approximations typically made concerns part thickness, which in reality can vary locally; nevertheless, a constant thickness value is almost always defined throughout the entire part for reasons of complexity. On the other hand, for precise fracture analysis within FEM, correct consideration of thickness is a key enabler. Thus, the availability of per-element thickness information, which does not exist explicitly in the FEM model, can significantly contribute to improved crash simulation quality, especially regarding fracture prediction. Even though the thickness is not explicitly available from the FEM model, it can be inferred from the original CAD geometric model through geometric calculations. This paper proposes and compares two thickness estimation algorithms based on ray tracing and nearest-neighbour 3D range searches. A systematic quantitative analysis of the accuracy of both algorithms is presented, as well as a thorough identification of the particular geometric arrangements under which their accuracy can be compared. These results enable the identification of each technique's weaknesses and hint towards a new, integrated approach to the problem that linearly combines the estimates produced by each algorithm.
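
A minimal sketch of the nearest-neighbour variant under simple assumptions (midsurface mesh nodes plus a point sampling of the outer CAD skin; the names are illustrative):

```python
import numpy as np
from scipy.spatial import cKDTree

def thickness_nearest_neighbour(mid_nodes, skin_points):
    """Per-node thickness estimate: twice the distance from each midsurface
    node to the closest point sampled on the part's outer CAD surface."""
    tree = cKDTree(skin_points)           # 3D range-search structure
    dist, _ = tree.query(mid_nodes, k=1)  # midsurface -> skin distance
    return 2.0 * dist                     # the midplane sits halfway through the wall
```

The ray-tracing variant instead casts a ray along the local surface normal and takes the distance to the first intersection with the opposite skin; the combined approach the paper hints at would blend the two per-element estimates linearly.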

Relevance:

30.00%

Publisher:

Abstract:

OBJECTIVE: To estimate the direct costs of schizophrenia for the public sector. METHODS: A study was carried out in the state of São Paulo, Brazil, during 1998. Data from the medical literature and governmental research bodies were gathered to estimate the total number of schizophrenia patients covered by the Brazilian Unified Health System. A decision tree was built based on the estimated distribution of patients under different types of psychiatric care. Medical charts from public hospitals and outpatient services were used to estimate the resources used over a one-year period. Direct costs were calculated by attributing monetary values to each resource used. RESULTS: Of all patients, 81.5% were covered by the public sector and distributed as follows: 6.0% in psychiatric hospital admissions, 23.0% in outpatient care, and 71.0% without regular treatment. The total direct cost of schizophrenia was US$ 191,781,327 (2.2% of total health care expenditure in the state). Of this total, 11.0% was spent on outpatient care and 79.2% on inpatient care. CONCLUSIONS: Most schizophrenia patients in the state of São Paulo receive no regular treatment. The findings point to the importance of investing in research aimed at improving resource allocation for the treatment of mental disorders in Brazil.
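
A minimal sketch of the decision-tree cost aggregation described; the branch shares come from the abstract, but the annual unit costs below are placeholders, not figures from the study:

```python
# Shares of publicly covered patients by care setting (from the abstract).
shares = {"inpatient": 0.06, "outpatient": 0.23, "no_regular_care": 0.71}

# Hypothetical annual direct cost per patient in each setting (US$).
unit_cost = {"inpatient": 12_000.0, "outpatient": 900.0, "no_regular_care": 0.0}

def total_direct_cost(n_covered_patients: float) -> float:
    """Sum of (patients in branch) x (annual unit cost) over the decision tree."""
    return sum(n_covered_patients * shares[s] * unit_cost[s] for s in shares)
```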

Relevance:

30.00%

Publisher:

Abstract:

This paper presents a novel approach to WLAN propagation models for use in indoor localization. The major goal of this work is to eliminate the need for in situ data collection to generate the fingerprinting map; instead, the map is generated using analytical propagation models such as COST Multi-Wall, COST 231 average wall, and Motley-Keenan. kNN (K-Nearest Neighbour) and WkNN (Weighted K-Nearest Neighbour) were used as location estimation algorithms to determine the accuracy of the proposed technique. This work relies on analytical and measurement tools to determine which path-loss propagation models are better suited to location estimation applications based on the Received Signal Strength Indicator (RSSI). The study presents different proposals for choosing the most appropriate values for the models' parameters, such as obstacle attenuations and coefficients. Some adjustments to these models, particularly to Motley-Keenan, taking wall thickness into account, are proposed. The best solution found is based on the adjusted Motley-Keenan and COST models, which allows propagation loss estimates to be obtained for several environments. Results obtained from two testing scenarios showed the reliability of the adjustments, yielding smaller errors between measured and predicted values.
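
A minimal sketch of a Multi-Wall (Motley-Keenan style) path-loss prediction of the kind used to build such a fingerprinting map; the exponent and per-wall attenuations are illustrative assumptions, not the paper's fitted parameters:

```python
import math

def multi_wall_path_loss(d, pl_d0=40.0, d0=1.0, n=2.0, walls=None):
    """PL(d) = PL(d0) + 10*n*log10(d/d0) + sum of per-wall attenuations (dB)."""
    walls = walls or {}                              # {"brick": (count, dB_each)}
    wall_loss = sum(count * att for count, att in walls.values())
    return pl_d0 + 10.0 * n * math.log10(d / d0) + wall_loss

# Example: 12 m from the access point, through two brick walls and a door.
pl_db = multi_wall_path_loss(12.0, walls={"brick": (2, 4.0), "door": (1, 1.5)})
```

Predicted RSSI at each grid point is then transmit power minus PL, and kNN/WkNN match a live RSSI vector against this synthetic map instead of surveyed measurements.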

Relevance:

30.00%

Publisher:

Abstract:

Dissertation submitted in partial fulfilment of the requirements for the Degree of Master of Science in Geospatial Technologies

Relevance:

30.00%

Publisher:

Abstract:

Dissertation presented to obtain a Ph.D. degree in Engineering and Technology Sciences, Biotechnology at the Instituto de Tecnologia Química e Biológica, Universidade Nova de Lisboa

Relevance:

30.00%

Publisher:

Abstract:

Harnessing the idle CPU cycles, storage space, and other resources of networked PCs for collaborative work is a main focus of all major grid computing research projects. Most university computer labs are nowadays equipped with powerful desktop PCs, and most of the time these machines lie idle, wasting computing power that could be put to good use. However, complex problems and the analysis of very large amounts of data require substantial computational resources. For such problems, one may run the analysis algorithms on very powerful and expensive computers, which limits the number of users who can afford such data analysis tasks. Instead of relying on single expensive machines, distributed computing systems offer the possibility of using a set of much less expensive machines for the same task. The BOINC and Condor projects have been successfully used to carry out real scientific research around the world at low cost. The main goal of this work is to deploy both distributed computing platforms, Condor and BOINC, and to use them to harness idle PC resources for academic researchers to use in their research work. In this thesis, data mining tasks were performed by implementing several machine learning algorithms on the distributed computing environment.
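
As an illustration of how such work is typically fanned out, a minimal sketch that generates an HTCondor submit description with one queued job per cross-validation fold; the file names are assumptions:

```python
# One Condor job per cross-validation fold of a data mining task.
N_FOLDS = 10
submit = (
    "executable = run_fold.sh\n"
    "arguments  = $(Process)\n"          # fold index, 0..N_FOLDS-1
    "output     = fold_$(Process).out\n"
    "error      = fold_$(Process).err\n"
    "log        = folds.log\n"
    f"queue {N_FOLDS}\n"
)
with open("folds.sub", "w") as f:
    f.write(submit)
# Submit from a lab machine with: condor_submit folds.sub
```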

Relevance:

30.00%

Publisher:

Abstract:

Currently, due to the widespread use of computers and the internet, students are trading libraries for the World Wide Web and laboratories for simulation programs. In most courses, simulators are made available to students and can be used to verify theoretical results or to test a hardware product under development. Although this is an attractive solution (a low-cost, easy, and fast way to carry out some coursework), it has major disadvantages. As everything is currently done with or in a computer, students are losing the "feel" for the real values of physical magnitudes. For instance, in engineering studies, and mainly in the first years, students need to learn electronics, algorithms, mathematics, and physics. All of these areas can use numerical analysis software, simulation software, or spreadsheets, and in the majority of cases the data used are either simulated or random numbers, although real data could be used instead. For example, if a course uses numerical analysis software and needs a dataset, the students can learn to manipulate arrays. Also, when using spreadsheets to build graphics, instead of using a random table, students could use a real dataset based, for instance, on the room temperature and its variation across the day. In this work we present a framework with a simple interface, allowing it to be used by different courses in which computers are part of the teaching/learning process, in order to give students a more realistic feel by using real data. The framework is based on a set of low-cost sensors for different physical magnitudes, e.g. temperature, light, wind speed, which are connected to a central server that students access over an Ethernet protocol, or are connected directly to the student's computer/laptop. These sensors use the available communication ports, such as serial ports, parallel ports, Ethernet, or Universal Serial Bus (USB). Since a central server is used, students are encouraged to use the sensor readings in their different courses, and consequently in different types of software, such as numerical analysis tools, spreadsheets, or simply inside any programming language whenever a dataset is needed. To this end, small pieces of hardware were developed, each containing at least one sensor and using a different type of computer communication. As long as the sensors are attached to a server connected to the internet, these tools can also be shared between different schools: sensors that are not available at a given school can be used by obtaining the values from other places that share them. Another remark is that students in more advanced years, with (theoretically) more know-how, can use courses that have some affinity with electronics development to build new sensor modules and expand the framework further. The final solution is very interesting: low cost and simple to develop, it allows flexible use of resources by employing the same materials in several courses, bringing real-world data into students' computer work.
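
A minimal sketch of a client for the kind of central sensor server described; the host name, port, and line-oriented request protocol are assumptions:

```python
import socket

def read_sensor(name: str, host: str = "sensors.school.example", port: int = 5000) -> float:
    """Request one reading from the central server over a plain TCP connection."""
    with socket.create_connection((host, port), timeout=5) as sock:
        sock.sendall(f"GET {name}\n".encode())      # hypothetical request line
        return float(sock.makefile().readline())    # e.g. degrees Celsius

temperature = read_sensor("room_temperature")  # usable as a dataset in any course
```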