888 results for Web modelling methods


Relevance: 30.00%

Abstract:

BACKGROUND: Most available pharmacotherapies for alcohol-dependent patients target abstinence; however, reduced alcohol consumption may be a more realistic goal. Using randomized clinical trial (RCT) data, a previous microsimulation model evaluated the clinical relevance of reduced consumption in terms of avoided alcohol-attributable events. Using real-life observational data, the current analysis aimed to adapt the model and confirm previous findings about the clinical relevance of reduced alcohol consumption. METHODS: Based on the prospective observational CONTROL study, which evaluated daily alcohol consumption among alcohol-dependent patients, the model predicted the probability of drinking any alcohol during a given day. Predicted daily alcohol consumption was simulated in a hypothetical sample of 200,000 patients observed over a year. Individual total alcohol consumption (TAC) and number of heavy drinking days (HDDs) were derived. Using published risk equations, the probabilities of alcohol-attributable adverse health events (e.g., hospitalizations or death) corresponding to the simulated consumptions were computed and aggregated for categories of patients defined by HDDs and TAC (expressed per 100,000 patient-years). Sensitivity analyses tested model robustness. RESULTS: Shifting from >220 HDDs per year to 120-140 HDDs, and from 36,000-39,000 g TAC per year (120-130 g/day) to 15,000-18,000 g TAC per year (50-60 g/day), substantially reduced the incidence of events (14,588 and 6,148 events avoided per 100,000 patient-years, respectively). Results were robust to sensitivity analyses. CONCLUSIONS: This study corroborates the previous microsimulation modeling approach and, using real-life data, confirms the RCT-based finding that reduced alcohol consumption is a relevant objective for consideration in alcohol dependence management to improve public health.
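The simulation logic described above can be sketched as follows. This is a minimal illustration, assuming a hypothetical per-patient drinking-probability model and a toy risk equation; all parameter values are placeholders, not those estimated from the CONTROL study.

```python
import numpy as np

rng = np.random.default_rng(42)

N_PATIENTS = 200_000   # cohort size used in the study
N_DAYS = 365

# Placeholder per-patient drinking propensity and mean intake on drinking days.
p_drink = rng.beta(4, 2, N_PATIENTS)             # P(drinks on a given day)
mean_intake = rng.gamma(5.0, 20.0, N_PATIENTS)   # grams of alcohol per drinking day

drinking_days = rng.binomial(N_DAYS, p_drink)    # days with any alcohol in a year
tac = drinking_days * mean_intake                # total alcohol consumption (g/year)
# Heavy drinking days: drinking days whose intake exceeds the cut-off (toy rule).
hdd = rng.binomial(drinking_days, np.clip(mean_intake / 120.0, 0.0, 1.0))

def p_event(tac_g_per_year):
    """Illustrative stand-in for a published alcohol-attributable risk equation."""
    return 1.0 - np.exp(-0.0004 * tac_g_per_year / 365.0)

events_per_100k = p_event(tac).mean() * 100_000
print(f"simulated events per 100,000 patient-years: {events_per_100k:,.0f}")
```

Comparing `events_per_100k` between consumption categories (defined by `hdd` and `tac`) would mirror the study's comparison of avoided events across HDD/TAC shifts.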

Relevance: 30.00%

Abstract:

OBJECTIVE: To quantify the relation between body mass index (BMI) and endometrial cancer risk, and to describe the shape of such a relation. DESIGN: Pooled analysis of three hospital-based case-control studies. SETTING: Italy and Switzerland. POPULATION: A total of 1449 women with endometrial cancer and 3811 controls. METHODS: Multivariate odds ratios (OR) and 95% confidence intervals (95% CI) were obtained from logistic regression models. The shape of the relation was determined using a class of flexible regression models. MAIN OUTCOME MEASURE: The relation of BMI with endometrial cancer. RESULTS: Compared with women with BMI 18.5 to <25 kg/m², the odds ratio was 5.73 (95% CI 4.28-7.68) for women with a BMI ≥35 kg/m². The odds ratios were 1.10 (95% CI 1.09-1.12) and 1.63 (95% CI 1.52-1.75), respectively, for increments of BMI of 1 and 5 units. The relation was stronger in never-users of oral contraceptives (OR 3.35, 95% CI 2.78-4.03, for BMI ≥30 versus <25 kg/m²) than in users (OR 1.22, 95% CI 0.56-2.67), and in women with diabetes (OR 8.10, 95% CI 4.10-16.01, for BMI ≥30 versus <25 kg/m²) than in those without diabetes (OR 2.95, 95% CI 2.44-3.56). The relation was best fitted by a cubic model, although after the exclusion of the 5% upper and lower tails it was best fitted by a linear model. CONCLUSIONS: The results of this study confirm a role of elevated BMI in the aetiology of endometrial cancer and suggest that the risk in obese women increases in a cubic nonlinear fashion. The relation was stronger in never-users of oral contraceptives and in women with diabetes. TWEETABLE ABSTRACT: Risk of endometrial cancer increases with elevated body weight in a cubic nonlinear fashion.
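Under the log-linear logistic model used here, the OR for a k-unit BMI increment is the per-unit OR raised to the k-th power (1.10⁵ ≈ 1.61, consistent with the reported 1.63). A minimal sketch on simulated data, assuming statsmodels and a true OR of 1.10 per BMI unit; this is not the study's dataset:

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 5000
bmi = rng.normal(27, 5, n)
logit = -4.0 + np.log(1.10) * bmi            # assume OR 1.10 per BMI unit
y = rng.binomial(1, 1 / (1 + np.exp(-logit)))

X = sm.add_constant(bmi)
fit = sm.Logit(y, X).fit(disp=0)
or_1 = np.exp(fit.params[1])
print(f"OR per 1 BMI unit:  {or_1:.2f}")
print(f"OR per 5 BMI units: {or_1**5:.2f}")  # ~1.10**5 = 1.61, cf. 1.63 reported
```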

Relevance: 30.00%

Abstract:

Landslide processes can have direct and indirect consequences affecting human lives and activities. In order to improve landslide risk management procedures, this PhD thesis investigates the capabilities of active LiDAR and RaDAR sensors for landslide detection and characterization at regional scales, spatial risk assessment over large areas, and slope-instability monitoring and modelling at site-specific scales. At regional scales, we first demonstrated the capabilities of recent boat-based mobile LiDAR to model the topography of the Normandy coastal cliffs. By comparing annual acquisitions, we also validated our approach to detect surface changes and thus map rock collapses, landslides and toe erosion affecting the shoreline at a county scale. We then applied a spaceborne InSAR approach to detect large slope instabilities in Argentina. Based on both phase and amplitude RaDAR signals, we extracted decisive information to detect, characterize and monitor two previously unknown, extremely slow landslides, and to quantify water level variations of a nearby dam reservoir involved. Finally, advanced investigations of fragmental rockfall risk assessment were conducted along roads of the Val de Bagnes by improving the Slope Angle Distribution approach and the FlowR software. Both rock-mass-failure susceptibilities and relative frequencies of block propagation were thereby assessed, and rockfall hazard and risk maps could be established at the valley scale. At slope-specific scales, in the Swiss Alps, we first integrated ground-based InSAR and terrestrial LiDAR acquisitions to map, monitor and model the Perraire rock slope deformation. By interpreting both methods individually as well as in an original integrated manner, we delimited the rockslide borders, computed volumes and highlighted non-uniform translational displacements along a wedge failure surface. Finally, we studied the specific requirements and practical issues encountered in early warning systems at some of the most studied landslides worldwide. As a result, we highlighted valuable key recommendations for designing new, reliable systems; in addition, we underlined conceptual issues that must be solved to improve current procedures. To sum up, the diversity of situations investigated provided extensive experience that revealed the potential and limitations of both methods, and highlighted the necessity of their complementary and integrated use.

Relevance: 30.00%

Abstract:

Rosin is a natural product from pine forests and is used as a raw material in resinate syntheses. Resinates are polyvalent metal salts of rosin acids; Ca and Ca/Mg resinates in particular find wide application in the printing ink industry. In this thesis, analytical methods were applied to increase general knowledge of resinate chemistry, and the reaction kinetics was studied in order to model the non-linear increase in solution viscosity during resinate syntheses by the fusion method. Solution viscosity in toluene is an important quality factor for resinates to be used in printing inks. The concept of critical resinate concentration, c_crit, was introduced to define an abrupt change in the dependence of viscosity on resinate concentration in the solution. The concept was then used to explain the non-linear increase in solution viscosity during resinate syntheses. A semi-empirical model with two estimated parameters was derived for the viscosity increase on the basis of apparent reaction kinetics. The model was used to control the viscosity and to predict the total reaction time of the resinate process. The kinetic data from the complex reaction media were obtained by acid value titration and by FTIR spectroscopic analyses, using a conventional calibration method to measure the resinate concentration and the concentration of free rosin acids. A multivariate calibration method was successfully applied to build partial least squares (PLS) models for monitoring acid value and solution viscosity in both the mid-infrared (MIR) and near-infrared (NIR) regions during the syntheses. The calibration models can be used for on-line monitoring of the resinate process. In the kinetic studies, two main reaction steps were observed during the syntheses. First, a fast irreversible resination reaction occurs at 235 °C, and then a slow thermal decarboxylation of rosin acids starts to take place at 265 °C. Rosin oil is formed during the decarboxylation reaction step, causing significant mass loss as the rosin oil evaporates from the system while the viscosity increases to the target level. The mass balance of the syntheses was determined based on the resinate concentration increase during the decarboxylation reaction step. A mechanistic study of the decarboxylation reaction was based on the observation that resinate molecules are partly solvated by rosin acids during the syntheses. Different decarboxylation mechanisms were proposed for the free and the solvating rosin acids. The deduced kinetic model supported the analytical data of the syntheses over a wide resinate concentration region, over a wide range of viscosity values and at different reaction temperatures. In addition, the application of the kinetic model to the modified resinate syntheses gave a good fit. A novel synthesis method with the addition of decarboxylated rosin (i.e. rosin oil) to the reaction mixture was introduced. The conversion of rosin acid to resinate was increased to the level necessary to obtain the target viscosity for the product at 235 °C. Because the reaction temperature is lower than in the traditional fusion synthesis at 265 °C, thermal decarboxylation is avoided. As a consequence, the mass yield of the resinate syntheses can be increased from ca. 70% to almost 100% by recycling the added rosin oil.
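As an illustration of fitting a two-parameter semi-empirical viscosity model and locating an abrupt upturn, here is a minimal sketch. The exponential form, the synthetic data and the slope threshold used to flag c_crit are all assumptions made for demonstration, not the model derived in the thesis:

```python
import numpy as np
from scipy.optimize import curve_fit

def viscosity_model(c, a, b):
    """Assumed form eta = a * exp(b * c): two estimated parameters."""
    return a * np.exp(b * c)

# Synthetic measurements (concentration in wt-%, viscosity in mPa*s).
c = np.array([30.0, 35.0, 40.0, 45.0, 50.0, 55.0])
eta = np.array([12.0, 25.0, 60.0, 150.0, 420.0, 1200.0])

(a, b), _ = curve_fit(viscosity_model, c, eta, p0=(1.0, 0.1))
print(f"a = {a:.3g}, b = {b:.3g}")

# Flag where the fitted slope first exceeds a chosen (arbitrary) threshold,
# as a crude proxy for the abrupt change at the critical concentration.
c_grid = np.linspace(30.0, 55.0, 200)
slope = a * b * np.exp(b * c_grid)
c_crit = c_grid[np.argmax(slope > 50.0)]
print(f"estimated c_crit ~ {c_crit:.1f} wt-%")
```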

Relevance: 30.00%

Abstract:

This paper presents a new numerical program able to model syntectonic sedimentation. The new model combines a discrete element model of the tectonic deformation of a sedimentary cover and a process-based model of sedimentation in a single framework. The integration of these two methods allows us to include the simulation of both sedimentation and deformation processes in a single, more effective model. The paper briefly describes the antecedents of the program, Simsafadim-Clastic and a discrete element model, in order to introduce the methodology used to merge both programs into the new code. To illustrate the operation and application of the program, the evolution of syntectonic geometries in an extensional environment, and also in association with thrust fault propagation, is analysed. Using the new code, much more complex and realistic depositional structures can be simulated, together with a more complex analysis of the evolution of the deformation within the sedimentary cover, which is seen to be affected by the presence of the new syntectonic sediments.
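The coupling idea — alternating a deformation step and a sedimentation step on a shared topographic surface within one time loop — can be sketched as follows. Both model steps below are trivial stand-ins; the actual code couples a discrete element model with the process-based Simsafadim-Clastic:

```python
import numpy as np

x = np.linspace(0.0, 10_000.0, 200)   # horizontal grid (m)
surface = np.zeros_like(x)            # topography (m), datum at base level
sediment = np.zeros_like(x)           # cumulative syntectonic fill (m)

def deform(surface, dt):
    """Stand-in tectonic step: hangingwall subsidence across a normal fault."""
    return surface + np.where(x > 4_000.0, -0.002 * dt, 0.0)

def deposit(surface, dt):
    """Stand-in sedimentation step: fill accommodation toward base level 0."""
    accommodation = np.maximum(0.0 - surface, 0.0)
    return surface + np.minimum(0.001 * dt, accommodation)

for step in range(1_000):
    surface = deform(surface, dt=1.0)   # tectonics modifies the topography...
    filled = deposit(surface, dt=1.0)   # ...which the sedimentation step sees
    sediment += filled - surface
    surface = filled

print(f"maximum syntectonic sediment thickness: {sediment.max():.2f} m")
```

Because deposition acts on the deformed surface at every step, the simulated fill geometry records the deformation history, which is the essence of the integrated approach.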

Relevance: 30.00%

Abstract:

Purpose: Encouraging office workers to 'sit less and move more' encompasses two public health priorities. However, there is little evidence on the effectiveness of workplace interventions for reducing sitting, even less about the longer-term effects of such interventions, and still less on dual-focused interventions. This study assessed the short- and mid-term impacts of a workplace web-based intervention (Walk@WorkSpain, W@WS; 2010-11) on self-reported sitting time, step counts and physical risk factors (waist circumference, BMI, blood pressure) for chronic disease. Methods: Employees at six Spanish university campuses (n=264; 42±10 years; 171 female) were randomly assigned by worksite and campus to an Intervention group (used W@WS; n=129; 87 female) or a Comparison group (maintained normal behavior; n=135; 84 female). This phased, 19-week program aimed to decrease occupational sitting time through increased incidental movement and short walks. A linear mixed model assessed changes in outcome measures between the baseline, ramping (8 weeks), maintenance (11 weeks) and follow-up (two months) phases for the Intervention versus Comparison groups. A significant 2 (group) × 2 (program phases) interaction was found for self-reported occupational sitting (F[3]=7.97, p=0.046), daily step counts (F[3]=15.68, p=0.0013) and waist circumference (F[3]=11.67, p=0.0086). The Intervention group decreased minutes of daily occupational sitting while also increasing step counts from baseline (446±126 minutes; 8,862±2,475 steps) through ramping (425±120; 9,345±2,435), maintenance (422±123; 9,638±3,131) and follow-up (414±129; 9,786±3,205). In the Comparison group, compared to baseline (404±106), sitting time remained unchanged through ramping and maintenance but decreased at follow-up (388±120), while step counts diminished across all phases. The Intervention group significantly reduced waist circumference by 2.1 cm from baseline to follow-up, while the Comparison group reduced waist circumference by 1.3 cm over the same period. Conclusions: W@WS is a feasible and effective evidence-based intervention that can be successfully deployed with sedentary employees to elicit sustained changes in 'sitting less and moving more'.
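The group-by-phase analysis can be sketched with a linear mixed model with random intercepts per participant, as below. The column names and simulated data are assumptions for illustration, not the trial's records:

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
phases = ["baseline", "ramping", "maintenance", "followup"]
rows = []
for i in range(264):
    group = "intervention" if i < 129 else "comparison"
    for ph in phases:
        effect = -25.0 if (group == "intervention" and ph != "baseline") else 0.0
        rows.append({"id": i, "group": group, "phase": ph,
                     "sitting_min": 446.0 + effect + rng.normal(0.0, 30.0)})
df = pd.DataFrame(rows)

# Random intercept per participant; fixed effects for group, phase and their
# interaction (the group-by-phase effect reported in the abstract).
model = smf.mixedlm("sitting_min ~ group * phase", df, groups=df["id"])
print(model.fit().summary())
```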

Relevance: 30.00%

Abstract:

This project addresses methodological and technological challenges in the development of multi-modal data acquisition and analysis methods for the representation of instrumental playing technique in music performance through auditory-motor patterning models. The case study is violin playing: a multi-modal database of violin performances has been constructed by recording different musicians while playing short exercises on different violins. The exercise set and recording protocol have been designed to sample the space defined by dynamics (from piano to forte) and tone (from sul tasto to sul ponticello), for each bow stroke type being played on each of the four strings (three different pitches per string) at two different tempi. The data, containing audio, video, and motion capture streams, has been processed and segmented to facilitate upcoming analyses. From the acquired motion data, the positions of the instrument string ends and the bow hair ribbon ends are tracked and processed to obtain a number of bowing descriptors suited for a detailed description and analysis of the bow motion patterns taking place during performance. Likewise, a number of sound perceptual attributes are computed from the audio streams. Besides the methodology and the implementation of a number of data acquisition tools, this project introduces preliminary results from analyzing bowing technique on a multi-modal violin performance database that is unique in its class. A further contribution of this project is the data itself, which will be made available to the scientific community through the repovizz platform.
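As an example of the kind of bowing descriptor derived from the motion capture streams, the sketch below computes bow-bridge distance and bow velocity from tracked 3D positions of the bow hair ribbon ends and the bridge end of a string. The marker layout, the 240 Hz rate and the synthetic stroke are assumptions, not the project's actual pipeline:

```python
import numpy as np

FS = 240.0  # assumed motion-capture sampling rate (Hz)

def bow_descriptors(frog, tip, bridge):
    """Inputs: (T, 3) arrays of tracked 3D positions over T frames."""
    bow_dir = tip - frog
    bow_dir /= np.linalg.norm(bow_dir, axis=1, keepdims=True)

    # Bow-bridge distance: distance from the bridge end of the string to the
    # line carrying the bow hair ribbon.
    to_bridge = bridge - frog
    along = np.sum(to_bridge * bow_dir, axis=1)           # contact point on hair
    residual = to_bridge - along[:, None] * bow_dir
    bow_bridge_dist = np.linalg.norm(residual, axis=1)

    # Bow velocity: time derivative of the contact position along the hair.
    bow_velocity = np.gradient(along) * FS
    return bow_bridge_dist, along, bow_velocity

# Synthetic example: a straight bow drawn along its own axis (y) across a
# fixed string; 480 frames at 240 Hz = 2 s of motion.
T = 480
frog = np.stack([np.zeros(T), np.linspace(-0.3, 0.3, T), np.zeros(T)], axis=1)
tip = frog + np.array([0.0, 0.65, 0.0])      # 65 cm hair ribbon
bridge = np.tile([0.3, -0.2, 0.1], (T, 1))

dist, pos, vel = bow_descriptors(frog, tip, bridge)
print(f"bow-bridge distance: {dist.mean():.3f} m, bow velocity: {vel.mean():.2f} m/s")
```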

Relevance: 30.00%

Abstract:

The user interface is the boundary between the user and the functions provided by the system, and its usability affects the performance of those functions either positively or negatively. Thus, during the design phase of an application it is worthwhile to evaluate the quality of the user interface and its functions and to test the viability of ideas by building prototypes. With prototyping, potential problems can be identified and fixed while the design is still on the drawing board. This Master's thesis discusses the prototyping of a user interface and its functions carried out during the development of a web application. User interfaces can be modelled with various methods, which the thesis reviews from a technological perspective, i.e. how prototyping methods can be applied in the different phases of a project. A short overview is given of the tools used to support prototyping, presenting at a general level a few software packages from different application categories, and the use of design patterns is also discussed. The thesis shows that general prototyping methods and principles can be applied to the prototyping of web applications. It is useful to start prototyping with sketches and to move at an early stage to HTML mock-ups, which come close to the implementation technologies and allow modelling the character, look, feel and interaction of the application. HTML prototypes can be refined into mixed-fidelity models, and they serve as the basis of the implementation. In further development, ideas can be presented with techniques of various levels of fidelity.

Relevance: 30.00%

Abstract:

This work presents models and methods that have been used in producing forecasts of population growth. The work is intended to emphasize the reliability bounds of the model forecasts. The Leslie model and various versions of logistic population models are presented, with references to the literature and to several studies; much of the relevant methodology has been developed in the biological sciences. The Leslie modelling approach involves the use of current trends in mortality, fertility, migration and emigration. The model treats the population divided into age groups, and the model is given as a recursive system. The other group of models is based on straightforward extrapolation of census data, where trajectories of a simple exponential growth function and of logistic models are used to produce the forecast. The work presents the basics of Leslie-type modelling and of the logistic models, including multi-parameter logistic functions. The latter model is also analysed from the point of view of model reliability. A Bayesian approach and the MCMC method are used to create error bounds for the model predictions.
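For the recursive system mentioned above, a minimal Leslie-matrix projection might look as follows; the three age groups and the fertility and survival rates are illustrative placeholders, not estimates from any census:

```python
import numpy as np

# Row 1: age-specific fertility; sub-diagonal: survival into the next group.
L = np.array([
    [0.0, 1.2, 0.5],
    [0.9, 0.0, 0.0],
    [0.0, 0.8, 0.0],
])
n = np.array([1000.0, 800.0, 600.0])   # initial population by age group

for year in range(10):                 # recursive projection n(t+1) = L n(t)
    n = L @ n
print(f"population after 10 steps: {n.sum():,.0f}")

# The long-run growth rate is the dominant eigenvalue of L.
lam = max(abs(np.linalg.eigvals(L)))
print(f"asymptotic growth factor per step: {lam:.3f}")
```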

Relevance: 30.00%

Abstract:

The research focus of this thesis is to explore options for building systems for business-critical web applications. Business criticality here includes requirements for data protection and system availability. The focus is on open source software, and the goals are to identify robust technologies and engineering practices for implementing such systems. The research methods include experiments with sample systems built around chosen software packages that represent certain technologies. The main research focused on finding a good method for database data replication, a key functionality for high-availability, database-driven web applications. The research also included identifying engineering best practices from books written by administrators of high-traffic web applications. The database replication experiment showed that block-level synchronous replication, as offered by the DRBD replication software, provides considerably more robust data protection and high-availability functionality than the leading open source database product MySQL with its built-in asynchronous replication. For master-master database setups, block-level replication is the recommended way to build high availability into the system. Based on the thesis research, building high-availability web applications is possible using a combination of open source software and engineering best practices for data protection, availability planning and scaling.

Relevance: 30.00%

Abstract:

The objective of this dissertation is to improve the dynamic simulation of fluid power circuits. A fluid power circuit is a typical way to implement power transmission in mobile working machines, e.g. cranes, excavators etc. Dynamic simulation is an essential tool in developing controllability and energy-efficient solutions for mobile machines, and efficient dynamic simulation is the basic requirement for real-time simulation. In the real-time simulation of fluid power circuits, numerical problems arise from the software and methods used for modelling and integration. A simulation model of a fluid power circuit is typically created using differential and algebraic equations, and efficient numerical methods are required since the differential equations must be solved in real time. Unfortunately, simulation software packages offer only a limited selection of numerical solvers. Numerical problems add noise to the results, which in many cases causes the simulation run to fail. Mathematically, fluid power circuit models are stiff systems of ordinary differential equations. The numerical solution of stiff systems can be improved by two alternative approaches. The first is to develop numerical solvers suitable for solving stiff systems. The second is to decrease the stiffness of the model itself by introducing models and algorithms that either decrease the highest eigenvalues or neglect them by introducing steady-state solutions of the stiff parts of the models. The thesis proposes novel methods using the latter approach. The study aims to develop practical methods usable in the dynamic simulation of fluid power circuits with explicit fixed-step integration algorithms. In this thesis, two mechanisms which make the system stiff are studied: the pressure drop approaching zero in the turbulent orifice model, and the volume approaching zero in the equation of pressure build-up. These are the critical areas for which alternative methods of modelling and numerical simulation are proposed. Generally, in hydraulic power transmission systems the orifice flow is clearly in the turbulent region; the flow becomes laminar as the pressure drop over the orifice approaches zero only in rare situations, e.g. when a valve is closed, when an actuator is driven against an end stopper, or when an external force makes the actuator switch direction during operation. This means that, in terms of accuracy, a description of laminar flow is not necessary. Unfortunately, when a purely turbulent description of the orifice is used, numerical problems occur as the pressure drop approaches zero, since the first derivative of the flow with respect to the pressure drop then approaches infinity. Furthermore, the second derivative becomes discontinuous, which causes numerical noise and an infinitesimally small integration step when a variable-step integrator is used. A numerically efficient model for the orifice flow is proposed, using a cubic spline function to describe the flow in the laminar and transition regions. The parameters of the cubic spline function are selected such that its first derivative equals the first derivative of the purely turbulent orifice flow model at the boundary. In the dynamic simulation of fluid power circuits a trade-off exists between accuracy and calculation speed; this trade-off is investigated for the two-regime orifice flow model. Very small volumes exist especially inside many types of valves, as well as between them.
The integration of pressures in small fluid volumes causes numerical problems in fluid power circuit simulation, and particularly in real-time simulation these numerical problems are a great weakness. The system stiffness approaches infinity as the fluid volume approaches zero: if fixed-step explicit algorithms for solving ordinary differential equations (ODEs) are used, system stability is easily lost when integrating pressures in small volumes. To solve the problem caused by small fluid volumes, a pseudo-dynamic solver is proposed. Instead of integrating the pressure in a small volume, the pressure is solved as a steady-state pressure created in a separate cascade loop by numerical integration. The hydraulic capacitance V/B_e of the parts of the circuit whose pressures are solved by the pseudo-dynamic method should be orders of magnitude smaller than that of the parts whose pressures are integrated. The key advantage of this novel method is that the numerical problems caused by small volumes are completely avoided; moreover, the method is freely applicable regardless of the integration routine used. A further strength of both of the above-mentioned methods is that they are suited for use together with the semi-empirical modelling method, which does not necessarily require any geometric data of the valves and actuators to be modelled; in this modelling method, most of the needed component information can be taken from the manufacturer's nominal graphs. This thesis introduces the methods and presents several numerical examples to demonstrate how the proposed methods improve the dynamic simulation of various hydraulic circuits.
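A hedged sketch of the two-regime orifice idea follows: a square-root turbulent branch above a transition pressure, and below it an odd cubic polynomial whose value and first derivative match the turbulent branch at the boundary. The flow coefficient and transition pressure are illustrative values, and the simple odd-polynomial form stands in for the thesis's actual spline construction:

```python
import numpy as np

K = 2e-7      # turbulent flow coefficient Cd*A*sqrt(2/rho) [m^3/(s*Pa^0.5)]
P_TR = 1e5    # transition pressure drop (Pa), below which the cubic is used

# Odd cubic q = a*dp + c*dp^3 with q(P_TR) = K*sqrt(P_TR) and
# q'(P_TR) = K / (2*sqrt(P_TR)); solving the two conditions gives:
A_COEF = 5.0 * K / (4.0 * np.sqrt(P_TR))
C_COEF = -K / (4.0 * P_TR ** 2.5)

def orifice_flow(dp):
    """Flow rate for a (possibly negative) pressure drop across the orifice."""
    dp = np.asarray(dp, dtype=float)
    turbulent = np.sign(dp) * K * np.sqrt(np.abs(dp))
    cubic = A_COEF * dp + C_COEF * dp ** 3
    return np.where(np.abs(dp) >= P_TR, turbulent, cubic)

# Near dp = 0 the slope is now the finite A_COEF, so a fixed-step explicit
# integrator no longer sees the infinite derivative of the pure sqrt model.
print(orifice_flow([-2e5, -1e3, 0.0, 1e3, 2e5]))
```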

Relevance: 30.00%

Abstract:

Machine learning provides tools for the automated construction of predictive models in data-intensive areas of engineering and science. The family of regularized kernel methods has in recent years become one of the mainstream approaches to machine learning, due to a number of advantages the methods share. The approach provides theoretically well-founded solutions to the problems of under- and overfitting, allows learning from structured data, and has been empirically demonstrated to yield high predictive performance on a wide range of application domains. Historically, the problems of classification and regression have received the majority of attention in the field. In this thesis we focus on another type of learning problem: learning to rank. In learning to rank, the aim is to learn, from a set of past observations, a ranking function that can order new objects according to how well they match some underlying criterion of goodness. As an important special case of the setting, we can recover the bipartite ranking problem, corresponding to maximizing the area under the ROC curve (AUC) in binary classification. Ranking applications appear in a large variety of settings; examples encountered in this thesis include document retrieval in web search, recommender systems, information extraction and automated parsing of natural language. We consider the pairwise approach to learning to rank, where ranking models are learned by minimizing the expected probability of ranking any two randomly drawn test examples incorrectly. The development of computationally efficient kernel methods based on this approach has in the past proven to be challenging. Moreover, it is not clear which techniques for estimating the predictive performance of learned models are the most reliable in the ranking setting, or how these techniques can be implemented efficiently. The contributions of this thesis are as follows. First, we develop RankRLS, a computationally efficient kernel method for learning to rank that is based on minimizing a regularized pairwise least-squares loss. In addition to training methods, we introduce a variety of algorithms for tasks such as model selection, multi-output learning and cross-validation, based on computational shortcuts from matrix algebra. Second, we improve the fastest known training method for the linear version of the RankSVM algorithm, which is one of the most well-established methods for learning to rank. Third, we study the combination of the empirical kernel map and reduced set approximation, which allows the large-scale training of kernel machines using linear solvers, and propose computationally efficient solutions to cross-validation when using the approach. Next, we explore the problem of reliable cross-validation when using AUC as a performance criterion, through an extensive simulation study. We demonstrate that the proposed leave-pair-out cross-validation approach leads to more reliable performance estimation than commonly used alternative approaches. Finally, we present a case study on applying machine learning to information extraction from biomedical literature, which combines several of the approaches considered in the thesis. The thesis is divided into two parts: Part I provides the background for the research work and summarizes the most central results, while Part II consists of the five original research articles that are the main contribution of this thesis.
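To make the pairwise least-squares objective concrete, here is a minimal linear sketch in the spirit of RankRLS: ridge regression on pairwise differences, minimizing the sum over pairs of ((y_i − y_j) − w·(x_i − x_j))² plus a regularizer. It forms all pairs explicitly for clarity; the thesis's algorithms avoid this via matrix-algebra shortcuts, and the kernelized method is not shown:

```python
import numpy as np
from itertools import combinations

def rank_ls_fit(X, y, lam=1.0):
    """Linear pairwise least-squares ranker (explicit-pairs sketch)."""
    pairs = list(combinations(range(len(y)), 2))
    D = np.array([X[i] - X[j] for i, j in pairs])   # pairwise feature diffs
    t = np.array([y[i] - y[j] for i, j in pairs])   # pairwise score diffs
    d = X.shape[1]
    return np.linalg.solve(D.T @ D + lam * np.eye(d), D.T @ t)

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 5))
w_true = np.array([1.0, -2.0, 0.5, 0.0, 3.0])
y = X @ w_true + rng.normal(scale=0.1, size=100)

w = rank_ls_fit(X, y)
scores = X @ w
# Evaluate with the pairwise criterion the model optimizes (cf. AUC).
correct = sum((scores[i] > scores[j]) == (y[i] > y[j])
              for i, j in combinations(range(100), 2))
print(f"correctly ordered pairs: {correct / (100 * 99 / 2):.3f}")
```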

Relevance: 30.00%

Abstract:

Search engine optimization and marketing is a set of processes widely used on websites to improve search engine rankings, which generates quality web traffic and increases ROI. Content is the most important part of any website, and CMS-based web development has become essential for most organizations and online businesses in building their online systems and websites. Every online business using a CMS wants to attract users (customers) in order to generate profit and ROI. This thesis comprises a brief study of existing SEO methods, tools and techniques and how they can be implemented to optimize a content-based website. As its result, the study provides recommendations on how to use SEO methods, tools and techniques to optimize CMS-based websites for the major search engines. The study compares the SEO features of popular CMS systems such as Drupal, WordPress and Joomla, and examines how SEO implementation can be improved on these systems. Knowledge of search engine indexing and of how search engines work is essential for a successful SEO campaign. This work is a complete guideline for web developers and SEO experts who want to optimize a CMS-based website for all major search engines.

Relevance: 30.00%

Abstract:

The state of Ceará, Brazil, has 75% of its area covered by the Brazilian semiarid region, with its peculiar features. In this state, dams constitute water infrastructure of strategic importance, ensuring, in both time and space, development and the supply of water to the population. However, the construction of reservoirs results in various impacts that should be carefully considered when deciding on their implementation. One of the impacts identified as negative is increased evaporation, which constitutes a major component of the water balance in reservoirs, especially in arid regions. Several methods for estimating evaporation have been proposed over time, many of them derived from the Penman equation. This study evaluated six different methods for estimating evaporation in order to determine the most suitable for use in hydrological models for the water balance of reservoirs in the state of Ceará. The tested methods were those proposed by Penman, Kohler-Nordenson-Fox, Priestley-Taylor, deBruim-Keijman, Brutsaert-Stricker and deBruim. The methods performed well when tested for the water balance during the dry season, and Priestley-Taylor was the most appropriate, since the water balance simulated with evaporation estimated by this method was closest to the water balance observed from measurements of reservoir level and the elevation-volume curve provided by the water resources management company of the state of Ceará (COGERH).
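For reference, the Priestley-Taylor estimate takes the form E = α·(Δ/(Δ+γ))·(Rn − G)/λ with α ≈ 1.26. A minimal sketch, using the FAO-56 expression for the slope of the saturation vapour pressure curve and illustrative single-day input values:

```python
import math

def priestley_taylor(rn, g, t_air, alpha=1.26, gamma=0.066):
    """Evaporation (mm/day) from net radiation Rn, soil/water heat flux G
    (both MJ m-2 day-1) and air temperature (deg C)."""
    # Slope of the saturation vapour pressure curve (kPa/C), FAO-56 form.
    es = 0.6108 * math.exp(17.27 * t_air / (t_air + 237.3))
    delta = 4098.0 * es / (t_air + 237.3) ** 2
    lam = 2.45  # latent heat of vaporization (MJ/kg), ~mm per MJ m-2
    return alpha * (delta / (delta + gamma)) * (rn - g) / lam

# Example: a warm, sunny day over a reservoir (placeholder values).
print(f"E = {priestley_taylor(rn=18.0, g=0.5, t_air=30.0):.2f} mm/day")
```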

Relevance: 30.00%

Abstract:

Scarcity of long-term series of sediment-related variables has led watershed managers to apply mathematical models to simulate sediment fluxes. Because of the high effort required to install and maintain sedimentological gauges, tracers have been proposed as an alternative for validating soil-redistribution modelling. In this study, the 137Cs technique was used to assess the performance of the WASA-SED model at the Benguê watershed (933 km²) in the Brazilian semiarid region. Qualitatively, good agreement was found between the 137Cs technique and the WASA-SED model results. Quantitatively, however, large differences, of up to two orders of magnitude, were found between the two methods. Among the uncertainties inherent to the 137Cs technique, the definition of the reference inventory seems to be a major source of imprecision. In addition, estimates of water and sediment fluxes with mathematical models usually also carry high uncertainty, contributing to the quantitative differences between the soil-redistribution estimates of the two methods.