904 results for "Optimisation of methods"


Relevance: 100.00%

Publisher:

Abstract:

Coordenação de Aperfeiçoamento de Pessoal de Nível Superior (CAPES)

Relevance: 100.00%

Publisher:

Abstract:

Graduate Program in Mechanical Engineering - FEG

Relevance: 100.00%

Publisher:

Abstract:

Background. The surgical treatment of dysfunctional hips is a severe burden for the patient and a costly therapy for public health. Hip resurfacing techniques seem to hold the promise of various advantages over traditional THR, particularly for young and active patients. Although the lesson provided in the past by many branches of engineering is that success in designing competitive products can be achieved only by predicting the possible scenarios of failure, to date implant quality is poorly addressed pre-clinically. Thus revision remains the only, delayed yet reliable, end point for assessment. The aim of the present work was to model the musculoskeletal system so as to develop a protocol for predicting failure of hip resurfacing prostheses. Methods. Preliminary studies validated the technique for the generation of subject-specific finite element (FE) models of long bones from Computed Tomography data. The proposed protocol consisted of numerical analyses of the prosthesis biomechanics, through deterministic and statistical studies, to assess the risk of biomechanical failure under the different operative conditions the implant might face in a population of interest during various activities of daily living. Physiological conditions were defined including the variability of the anatomy, bone densitometry, surgical uncertainties and published boundary conditions at the hip. The protocol was tested by analysing a successful design on the market and a new prototype of a resurfacing prosthesis. Results. The intrinsic accuracy of the models in bone stress predictions (RMSE < 10%) was aligned with the current state of the art in this field. The accuracy of prediction of the bone-prosthesis contact mechanics was also excellent (< 0.001 mm). The sensitivity of the model predictions to uncertainties in the modelling parameters was below 8.4%.
The analysis of the successful design showed very good agreement with published retrospective studies. The geometry optimisation of the new prototype led to a final design with a low risk of failure. The statistical analysis confirmed the minimal risk of the optimised design over the entire population of interest. The performance of the optimised design showed a significant improvement over the first prototype (+35%). Limitations. In the authors' opinion, the major limitation of this study lies in the boundary conditions. The muscular forces and the hip joint reaction were derived from the few data available in the literature, which can be considered significant but hardly representative of the entire variability of boundary conditions the implant might face over the patient population. This moved the focus of the research to modelling the musculoskeletal system; the ongoing activity is to develop subject-specific musculoskeletal models of the lower limb from medical images. Conclusions. The developed protocol was able to accurately predict known clinical outcomes when applied to a well-established device and to support the design optimisation phase, providing important information on critical patient characteristics when applied to a new prosthesis. The presented approach has sufficient generality to allow the extension of the protocol to a large set of orthopaedic scenarios with minor changes. Hence, a failure mode analysis criterion can be considered a suitable tool in developing new orthopaedic devices.
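The statistical part of such a protocol, i.e. estimating a failure probability over a population of anatomies, densities and surgical placements, can be illustrated with a Monte Carlo sketch. Everything below is hypothetical: the surrogate safety-factor model and all parameter distributions are invented for illustration and do not reproduce the thesis's FE pipeline.

```python
import random

def safety_factor(cup_angle_deg, bone_density, load_bw):
    # Hypothetical surrogate model: the safety factor drops with a
    # steeper cup angle, a higher hip load and a lower bone density.
    nominal = 3.0
    penalty = 0.02 * (cup_angle_deg - 45.0) + 0.5 * (load_bw - 3.0)
    bonus = 1.5 * (bone_density - 1.0)
    return nominal - penalty + bonus

def failure_risk(n_samples=10_000, seed=42):
    """Monte Carlo estimate of P(safety factor < 1) over the population."""
    rng = random.Random(seed)
    failures = 0
    for _ in range(n_samples):
        angle = rng.gauss(45.0, 5.0)    # surgical cup angle, degrees
        density = rng.gauss(1.0, 0.1)   # normalised bone density
        load = rng.gauss(3.0, 0.8)      # peak hip load, body weights
        if safety_factor(angle, density, load) < 1.0:
            failures += 1
    return failures / n_samples

print(f"estimated failure risk: {failure_risk():.3%}")
```

In a real protocol the surrogate would be replaced by the subject-specific FE evaluation, with the same sampling loop around it.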

Relevance: 100.00%

Publisher:

Abstract:

A first phase of the research activity was devoted to the study of the state of the art of cycling infrastructure, bicycle use and evaluation methods. In this part, the candidate studied the "bicycle system" in countries with high bicycle use, in particular the Netherlands. An evaluation was carried out of the questionnaires from the survey on mobility in general conducted within the European project BICY in 13 cities of the participating countries. The questionnaire was designed, tested and implemented, and was later validated by a test in Bologna. The results were corrected with information on the demographic situation and compared with official data. The cycling infrastructure analysis was conducted on the basis of information from the OpenStreetMap database. The activity consisted of programming algorithms in Python that extract infrastructure data from the database for a region, then sort and filter the cycling infrastructure while calculating attributes such as the length of the arcs of the paths. The results obtained were compared with official data where available. The structure of the thesis is as follows: 1. Introduction: description of the state of cycling in several advanced countries, description of methods of analysis and their importance for implementing appropriate cycling policies; supply and demand of bicycle infrastructure. 2. Survey on mobility: details of the investigation developed and the method of evaluation; the results obtained are presented and compared with official data. 3. Analysis of cycling infrastructure based on information from the OpenStreetMap database: describes the methods and algorithms developed during the PhD; the results obtained by the algorithms are compared with official data. 4. Discussion: the above results are discussed and compared; in particular, cycling demand is compared with the length of the cycle network within a city. 5. Conclusions.
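The core of such an extraction pipeline, filtering OSM-style tagged ways and summing arc lengths, can be sketched as follows. This is a minimal illustration, not the thesis's algorithms: the tag checks reflect common OpenStreetMap cycling conventions, and the toy data is invented.

```python
import math

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance between two WGS84 points, in metres."""
    r = 6_371_000.0
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = p2 - p1
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def is_cycling_way(tags):
    # Common OSM tagging for cycling infrastructure.
    return (tags.get("highway") == "cycleway"
            or tags.get("cycleway") in {"lane", "track"}
            or tags.get("bicycle") == "designated")

def total_cycleway_length_m(ways):
    """Sum the length of every cycling arc (a polyline of (lat, lon) nodes)."""
    total = 0.0
    for tags, nodes in ways:
        if not is_cycling_way(tags):
            continue
        for (la1, lo1), (la2, lo2) in zip(nodes, nodes[1:]):
            total += haversine_m(la1, lo1, la2, lo2)
    return total

# Toy example: one cycleway and one ordinary road near Bologna.
ways = [
    ({"highway": "cycleway"}, [(44.494, 11.342), (44.495, 11.343)]),
    ({"highway": "residential"}, [(44.494, 11.342), (44.490, 11.340)]),
]
print(f"cycle network length: {total_cycleway_length_m(ways):.0f} m")
```

A regional analysis would feed this with ways parsed from an OSM extract instead of the in-memory list.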

Relevance: 100.00%

Publisher:

Abstract:

In this work we study a polyenergetic and multimaterial model for breast image reconstruction in Digital Tomosynthesis, taking into consideration the variety of materials forming the object and the polyenergetic nature of the X-ray beam. The modelling of the problem leads to the resolution of a high-dimensional nonlinear least-squares problem that, being an ill-posed inverse problem, needs some kind of regularization. We test two main classes of methods: the Levenberg-Marquardt method (together with the Conjugate Gradient method for the computation of the descent direction) and two limited-memory BFGS-like methods (L-BFGS). We perform experiments for different values of the regularization parameter (constant or varying at each iteration), tolerances and stopping conditions. Finally, we analyse the performance of the various methods by comparing relative errors, numbers of iterations, run times and the quality of the reconstructed images.
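The Levenberg-Marquardt idea, damping the Gauss-Newton normal equations with a regularization term that is loosened on success and tightened on failure, can be sketched on a toy nonlinear least-squares fit. This is a generic illustration under invented test data, not the tomosynthesis solver of the work (which is far larger and solves the damped system with Conjugate Gradient rather than a dense factorization).

```python
import numpy as np

def levenberg_marquardt(residual, jacobian, x0, lam=1e-2, iters=50):
    """Minimise 0.5*||r(x)||^2 with damped Gauss-Newton (LM) steps."""
    x = np.asarray(x0, dtype=float)
    for _ in range(iters):
        r = residual(x)
        J = jacobian(x)
        # Damped normal equations: (J^T J + lam*I) dx = -J^T r
        A = J.T @ J + lam * np.eye(x.size)
        dx = np.linalg.solve(A, -J.T @ r)
        x_new = x + dx
        if residual(x_new) @ residual(x_new) < r @ r:
            x, lam = x_new, lam * 0.5   # accept; trust the model more
        else:
            lam *= 10.0                 # reject; damp harder
    return x

# Toy problem: fit y = exp(a*t) + b to exact data with a = 0.5, b = 2.
t = np.linspace(0.0, 2.0, 20)
y = np.exp(0.5 * t) + 2.0
res = lambda x: np.exp(x[0] * t) + x[1] - y
jac = lambda x: np.column_stack([t * np.exp(x[0] * t), np.ones_like(t)])
x_hat = levenberg_marquardt(res, jac, x0=[0.0, 0.0])
print(x_hat)   # close to [0.5, 2.0]
```

The damping parameter `lam` plays exactly the role of the regularization parameter discussed in the abstract, constant or updated at each iteration.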

Relevance: 100.00%

Publisher:

Abstract:

Clinicians find standardized mean differences (SMDs) calculated from continuous outcomes difficult to interpret. Our objective was to determine the performance of methods for converting SMDs or means to odds ratios of treatment response and numbers needed to treat (NNTs), as more intuitive measures of treatment effect.
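One published conversion of this kind, not necessarily the one this study evaluates, is the Hasselblad-Hedges transformation from an SMD to a log odds ratio, after which an NNT follows from an assumed control-group response rate. A sketch, with illustrative numbers:

```python
import math

def smd_to_odds_ratio(d):
    """Hasselblad-Hedges conversion: ln(OR) = d * pi / sqrt(3)."""
    return math.exp(d * math.pi / math.sqrt(3))

def nnt_from_or(odds_ratio, control_rate):
    """NNT given an odds ratio and the control-group response rate."""
    odds_c = control_rate / (1 - control_rate)
    odds_t = odds_ratio * odds_c
    treated_rate = odds_t / (1 + odds_t)
    return 1.0 / (treated_rate - control_rate)

or_ = smd_to_odds_ratio(0.5)            # a "moderate" SMD of 0.5
print(round(or_, 2))                    # ~2.48
print(round(nnt_from_or(or_, 0.3), 1))  # NNT assuming 30% control response
```

Note that the resulting NNT depends strongly on the assumed control-group rate, which is one reason the performance of such conversions needs empirical evaluation.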

Relevance: 100.00%

Publisher:

Abstract:

We derive a new class of iterative schemes for accelerating the convergence of the EM algorithm, by exploiting the connection between fixed point iterations and extrapolation methods. First, we present a general formulation of one-step iterative schemes, which are obtained by cycling with the extrapolation methods. We then square the one-step schemes to obtain the new class of methods, which we call SQUAREM. Squaring a one-step iterative scheme is simply applying it twice within each cycle of the extrapolation method. Here we focus on the first-order, or rank-one, extrapolation methods for two reasons: (1) simplicity and (2) computational efficiency. In particular, we study two first-order extrapolation methods, the reduced rank extrapolation (RRE1) and the minimal polynomial extrapolation (MPE1). The convergence of the new schemes, both one-step and squared, is non-monotonic with respect to the residual norm. The first-order one-step and SQUAREM schemes are linearly convergent, like the EM algorithm, but they have a faster rate of convergence. We demonstrate, through five different examples, the effectiveness of the first-order SQUAREM schemes, SqRRE1 and SqMPE1, in accelerating the EM algorithm. The SQUAREM schemes are also shown to be vastly superior to their one-step counterparts, RRE1 and MPE1, in terms of computational efficiency. The proposed extrapolation schemes can fail due to the numerical problems of stagnation and near breakdown. We have developed a new hybrid iterative scheme that combines the RRE1 and MPE1 schemes in such a manner that it overcomes both stagnation and near breakdown. The squared first-order hybrid scheme, SqHyb1, emerges as the iterative scheme of choice based on our numerical experiments. It combines the fast convergence of SqMPE1, while avoiding near breakdowns, with the stability of SqRRE1, while avoiding stagnations. The SQUAREM methods can be incorporated very easily into an existing EM algorithm.
They only require the basic EM step for their implementation and do not require any other auxiliary quantities such as the complete-data log likelihood or its gradient or Hessian. They are an attractive option in problems with a very large number of parameters, and in problems where the statistical model is complex, the EM algorithm is slow and each EM step is computationally demanding.
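The "only the basic EM step is required" property can be seen in a sketch of the squaring idea: two applications of the EM map per cycle, an extrapolated update, and an optional stabilising EM step. The steplength below is one common choice from the later SQUAREM literature (the "S3" scheme); the paper's RRE1/MPE1 steplengths differ but follow the same pattern. The toy mixture problem is invented for illustration.

```python
import numpy as np

def squarem(em_map, x0, tol=1e-10, max_iter=200):
    """Accelerate a fixed-point (EM) map F with one squared step per cycle."""
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        x1 = em_map(x)
        x2 = em_map(x1)
        r = x1 - x                  # first EM increment
        v = (x2 - x1) - r           # curvature of the EM trajectory
        if np.linalg.norm(v) < tol:
            return x2
        # One common steplength choice; RRE1/MPE1 use related formulas.
        alpha = -np.linalg.norm(r) / np.linalg.norm(v)
        x_new = x - 2 * alpha * r + alpha**2 * v
        x_new = em_map(x_new)       # stabilising extra EM step
        if np.linalg.norm(x_new - x) < tol:
            return x_new
        x = x_new
    return x

# Toy EM: estimate the mixing weight of a two-component Gaussian mixture
# with known means -2 and +2 and unit variances (true weight 0.7).
rng = np.random.default_rng(0)
data = np.concatenate([rng.normal(-2, 1, 300), rng.normal(2, 1, 700)])

def em_map(w):
    p1 = w * np.exp(-0.5 * (data - 2.0) ** 2)
    p0 = (1 - w) * np.exp(-0.5 * (data + 2.0) ** 2)
    return np.atleast_1d(np.mean(p1 / (p1 + p0)))

w_hat = squarem(em_map, np.array([0.5]))
print(float(w_hat))   # near 0.7
```

Only `em_map` is problem-specific; no gradient or Hessian of the complete-data log likelihood appears anywhere, which is the point the abstract makes.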

Relevance: 100.00%

Publisher:

Abstract:

A significant cost for foundations is the design and installation of piles when they are required due to poor ground conditions. Not only is it important that piles be designed properly, but also that the installation equipment and total cost be evaluated. To assist in the evaluation of piles, a number of methods have been developed. In this research three of these methods were investigated, developed by the Federal Highway Administration, the US Corps of Engineers and the American Petroleum Institute (API). The results from these methods were entered into the program GRLWEAP™ to assess pile drivability and to provide a standard basis for comparing the three methods. An additional element of this research was to develop Excel spreadsheets to implement these three methods. Currently the Army Corps and API methods do not have publicly available software and must be performed manually, which requires that data be read off figures and tables and can introduce error into the prediction of pile capacities. Following development of the Excel spreadsheets, they were validated with both manual calculations and existing data sets to ensure that the output is correct. To evaluate the three pile capacity methods, data was utilized from four project sites in North America. The data included site geotechnical data along with field-determined pile capacities. In order to achieve a standard comparison of the data, the pile capacities and geotechnical data from the three methods were entered into GRLWEAP. The sites consisted of both cohesive and cohesionless soils: one site was primarily cohesive, one was primarily cohesionless, and the other two consisted of inter-bedded cohesive and cohesionless soils. Based on this limited set of data, the results indicated that the US Corps of Engineers method compared most closely with the field test data, followed by the API method to a lesser degree.
The DRIVEN program compared favorably in cohesive soils, but over-predicted in cohesionless material.
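The kind of manual calculation the spreadsheets automate can be sketched for the API method in cohesive soil: skin friction from an adhesion factor α applied to the undrained shear strength, plus end bearing. The α formula below follows the API RP 2A-style expression from memory, and the pile geometry and soil values are purely illustrative; the research's spreadsheets, not this sketch, are the validated implementation.

```python
def api_alpha(su, p0):
    """API-style adhesion factor for cohesive soil (psi = su / p'0)."""
    psi = su / p0
    exponent = -0.5 if psi <= 1.0 else -0.25
    return min(1.0, 0.5 * psi ** exponent)

def pile_capacity_kn(layers, perimeter_m, tip_area_m2):
    """Skin friction plus end bearing for a pile driven through clay layers.

    layers: list of (thickness_m, su_kPa, effective_stress_kPa), top down.
    """
    skin = sum(api_alpha(su, p0) * su * perimeter_m * t
               for t, su, p0 in layers)
    tip_su = layers[-1][1]
    end_bearing = 9.0 * tip_su * tip_area_m2   # q = 9 * su at the tip
    return skin + end_bearing

# Illustrative 10 m pile, 0.3 m square section, through two clay layers.
layers = [(5.0, 25.0, 40.0), (5.0, 50.0, 90.0)]
capacity = pile_capacity_kn(layers, perimeter_m=1.2, tip_area_m2=0.09)
print(f"{capacity:.0f} kN")
```

Reading α off a chart by hand is exactly the step the abstract identifies as a source of error; encoding the formula removes it.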

Relevance: 100.00%

Publisher:

Abstract:

BACKGROUND Prognostic classification of congestive heart failure (CHF) is difficult and only possible with the help of additional diagnostic tools. B-type natriuretic peptide (BNP) has been used as a diagnostic and prognostic marker for patients (pts) with CHF. In this study, the clinical value of BNP for stratification and treatment of pts with CHF was evaluated. PATIENTS AND METHODS 33 out-pts with CHF (age 57 +/- 12 years) were included. Left-ventricular (LV) ejection fraction (EF) was 27 +/- 8% (mean +/- SD) and NYHA class 2.4 +/- 0.7. The following parameters were measured: BNP and sodium from blood samples, exercise performance from the 6-minute walking test (6MWT, metres) (n = 18), LV end-diastolic diameter (LVEDD) and LV mass (LVM) from 2D echocardiography (n = 33), as well as LV end-diastolic pressure (LVEDP, n = 23) and systemic vascular resistance (SVR, n = 20) from heart catheterisation. Ten pts were hospitalised in the preceding 6 months because of worsening CHF or for optimisation of medical therapy. BNP was measured at the beginning and end of the hospital stay. Follow-up was for 1 year. RESULTS Pts with a high NYHA class had a higher BNP (pg/ml) than those with a low NYHA class: NYHA I 51 +/- 20, II 281 +/- 223, III 562 +/- 346 and IV 1061 +/- 126 pg/ml (p = 0.002). BNP correlated with LVEDP (r = 0.50, p < 0.02) and SVR (r = 0.49, p < 0.03), and inversely with 6MWT (r = -0.60, p < 0.009), LVEF (r = -0.49, p < 0.004) and sodium (r = -0.36, p = 0.04). In the hospitalised pts, mean BNP (pg/ml) was 881 +/- 695 at admission and 532 +/- 435 at discharge (n.s.). The decrease in BNP during hospitalisation paralleled weight loss and was significantly greater in patients with BNP > 1000 pg/ml at admission (n = 5) than in the 5 patients with BNP < 1000 (p < 0.03).
Patients with an adverse event during the 1-year follow-up had significantly higher BNP both at steady state (603 +/- 359 pg/ml) and at the time of decompensation than patients with a favourable outcome (227 +/- 218 pg/ml, p < 0.001). CONCLUSIONS BNP correlates well with the clinical severity of CHF (NYHA class) and is directly related to filling pressure (LVEDP), LV function (LVEF) and exercise performance (6MWT). Furthermore, BNP has prognostic impact with regard to adverse clinical events.

Relevance: 100.00%

Publisher:

Abstract:

The purpose of this study was to compare the relative effectiveness of alternative methods of tracing named contacts of syphilis patients. A total of 236 contacts, identified by patients in two City of Houston Department of Health and Human Services clinics during the period April 1 through July 31, 1987, were studied. After contacts were grouped by sex and age, the proportion brought to examination by each of three methods, and by a combination of methods, was determined for each subgroup. The study found that 78.4% of all 236 named sex contacts reported were located and brought to examination by the various methods of contact tracing and that 21.6% were missed. Of the 185 contacts examined, a combination of methods identified 47.7% of the cases; telephone contact, 28.6%; field contact, 16.9%; and patient referral, 11.8%. Of the 236 contacts reported, males made up 56.8% and females 43.2%. Contact tracing was more successful among females, with 81.4% of the 102 named female contacts, as compared to 76.1% of the 134 named male contacts, being brought to examination. It is not known whether equal efforts were exerted in the follow-up of both male and female contacts. In both female and male subgroups, a combination of methods brought over 40% of sex contacts to examination. Telephone contact among females yielded 27.7% of the cases and field contact 18.1%, whereas in males, telephone contact identified 29.4% of the cases and field contact 15.7%. Patient referral was the least productive method in both sex groups, locating 12.8% in males as compared to 10.8% in females. On an age-specific basis, a combination of methods was most effective in the 15-39 age group, whereas telephone contact was most effective in the 40-44 age group, and field contact in the 50-54 age group.
Of all the methods of contact tracing, patient referral was the least productive in most age groups. Future studies of contact tracing should incorporate several important variables which were not examined in this study.

Relevance: 100.00%

Publisher:

Abstract:

There is increasing pressure on developers to produce usable systems, which requires the use of appropriate methods to support user-centred design during development. There is currently no consistent advice on which methods are appropriate in which circumstances, so the selection of methods relies on individual experience and expertise. Considerable effort is required to collate information from various sources and to understand the applicability of each method in a particular situation. Usability Planner is a tool aimed at supporting the selection of the most appropriate methods depending on project and organizational constraints. Many of the rules employed are derived from ISO standards, complemented with rules from the authors' experience.
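A rule-driven recommender of this kind can be sketched in a few lines. The rules and method names below are hypothetical examples in the spirit of the tool, not the actual rules of Usability Planner or of any ISO standard.

```python
# Each rule pairs a project-constraint predicate with the user-centred
# design methods it recommends when the predicate holds.
RULES = [
    (lambda p: p["users_available"], ["usability testing", "interviews"]),
    (lambda p: not p["users_available"], ["heuristic evaluation", "expert review"]),
    (lambda p: p["stage"] == "requirements", ["contextual inquiry", "personas"]),
    (lambda p: p["budget"] == "low", ["heuristic evaluation"]),
]

def recommend(project):
    """Union of methods whose rule fires for this project, in rule order."""
    seen, out = set(), []
    for condition, methods in RULES:
        if condition(project):
            for m in methods:
                if m not in seen:
                    seen.add(m)
                    out.append(m)
    return out

project = {"users_available": False, "stage": "requirements", "budget": "low"}
print(recommend(project))
```

Encoding the selection knowledge as data rather than code is what lets such a tool grow as new rules are derived from standards or experience.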

Relevance: 100.00%

Publisher:

Abstract:

Patent and trademark offices which run according to principles of new management have an inherent need for dependable forecasting data when planning capacity and service levels. The ability of the Spanish Office of Patents and Trademarks to carry out efficient planning of its resource needs requires the use of methods which allow it to predict changes in the number of patent and trademark applications at different time horizons. The approach to predicting the time series of Spanish patent and trademark applications (1979-2009) was based on the use of different short-term time-series prediction techniques. The methods used can be grouped into two specific areas: regression models of trends and time-series models. The results of this study show that it is possible to model the series of patent and trademark applications with different models, especially ARIMA, with satisfactory model fit and relatively low error.
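The simplest member of the time-series family the study draws on is an autoregressive model; a minimal AR(1) fit and forecast can be sketched as below. This is not the study's ARIMA modelling, and the synthetic application counts are invented for illustration.

```python
import numpy as np

def fit_ar1(series):
    """Least-squares fit of x_t = c + phi * x_{t-1} + e_t."""
    x_prev, x_next = series[:-1], series[1:]
    A = np.column_stack([np.ones_like(x_prev), x_prev])
    (c, phi), *_ = np.linalg.lstsq(A, x_next, rcond=None)
    return c, phi

def forecast(series, c, phi, horizon):
    """Iterate the fitted recurrence to predict the next `horizon` values."""
    out, x = [], series[-1]
    for _ in range(horizon):
        x = c + phi * x
        out.append(float(x))
    return out

# Synthetic annual application counts (1979-2009) with a mild upward trend.
rng = np.random.default_rng(1)
years = np.arange(1979, 2010)
apps = 2000 + 40 * (years - 1979) + rng.normal(0, 30, years.size)

c, phi = fit_ar1(apps)
print(forecast(apps, c, phi, horizon=3))
```

A full ARIMA model adds differencing and moving-average terms on top of this autoregressive core, which is why it copes better with trend and noise structure.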

Relevance: 100.00%

Publisher:

Abstract:

Current model-driven Web Engineering approaches (such as OO-H, UWE or WebML) provide a set of methods and supporting tools for the systematic design and development of Web applications. Each method addresses different concerns using separate models (content, navigation, presentation, business logic, etc.), and provides model compilers that produce most of the logic and Web pages of the application from these models. However, these proposals also have some limitations, especially for exchanging models or representing further modeling concerns, such as architectural styles, technology independence, or distribution. A possible solution to these issues is to make model-driven Web Engineering proposals interoperate, so that they can complement each other and exchange models between the different tools. MDWEnet is a recent initiative started by a small group of researchers working on model-driven Web Engineering (MDWE). Its goal is to improve current practices and tools for the model-driven development of Web applications through better interoperability. The proposal is based on the strengths of current model-driven Web Engineering methods and the existing experience and knowledge in the field. This paper presents the background, motivation, scope, and objectives of MDWEnet. Furthermore, it reports on the MDWEnet results and achievements so far, and its future plan of action.

Relevance: 100.00%

Publisher:

Abstract:

The master's thesis presents methods for the intelligent analysis and 3D visualization of ECG signals, with the aim of increasing the efficiency of ECG analysis by extracting additional data. Visualization is presented as part of the signal-analysis task; imaging techniques and their mathematical description are considered. Algorithms for calculating and visualizing signal attributes were developed and are described using mathematical methods and signal-mining tools. A pattern-search model was constructed for comparing the accuracy of the methods, problems of clustering and classification of the data were solved, and a data-visualization program was developed. This approach gives the highest accuracy in the intelligent-analysis task, as confirmed in this work. The visualization and analysis techniques considered are also applicable to multi-dimensional signals of other kinds.
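The clustering step of such a pipeline can be sketched with a plain k-means on per-beat features. The feature choice (RR interval and R-peak amplitude) and the synthetic data are hypothetical illustrations, not the thesis's attributes or signals.

```python
import numpy as np

def kmeans(points, k, iters=50, seed=0):
    """Plain k-means: assign to the nearest centroid, then recompute centroids."""
    rng = np.random.default_rng(seed)
    centroids = points[rng.choice(len(points), k, replace=False)]
    for _ in range(iters):
        d = np.linalg.norm(points[:, None, :] - centroids[None, :, :], axis=2)
        labels = d.argmin(axis=1)
        new = []
        for j in range(k):
            members = points[labels == j]
            # Keep an empty cluster's centroid where it was.
            new.append(members.mean(axis=0) if len(members) else centroids[j])
        centroids = np.array(new)
    return labels, centroids

# Hypothetical beat features: (RR interval in s, R-peak amplitude in mV),
# one tight group of "normal" beats and one of "ectopic" beats.
rng = np.random.default_rng(2)
normal = rng.normal([0.8, 1.0], [0.05, 0.1], (50, 2))
ectopic = rng.normal([0.5, 1.6], [0.05, 0.1], (50, 2))
features = np.vstack([normal, ectopic])

labels, centroids = kmeans(features, k=2)
print(centroids.round(2))
```

Classification then amounts to assigning each new beat to the nearest learned centroid, and visualization to plotting the feature space coloured by cluster.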