888 results for Web modelling methods


Relevance:

30.00%

Publisher:

Abstract:

This research is situated at the intersection of educational science, computer science, and school practice, and thus has a strongly interdisciplinary character. From the perspective of educational science, it is a research project in the fields of e-learning and multimedia learning, addressing the question of suitable information systems for creating and sharing digital, multimedia, interactive learning modules. To this end, the methodological and didactic advantages of digital learning content over classical media such as books and paper were first compiled, and potential benefits of new Web 2.0 technologies were identified. Building on this, existing authoring tools for creating digital learning modules and existing exchange platforms were analysed to determine the extent to which they already support and use Web 2.0 technologies. From the perspective of computer science, the analysis of existing systems yielded a requirements profile for a new authoring tool and a new exchange platform for digital learning modules. Following the Design Science Research approach, the new system was realised in an iterative development process as the web application LearningApps.org and continuously evaluated with teachers from school practice. Current web technologies were applied in its development. The result of the research is a production information system that is already used by thousands of users in several countries, both in schools and in industry. An empirical study confirmed that the system achieves the goal pursued with its development: simplifying the creation and exchange of digital learning modules. From the perspective of school practice, LearningApps.org contributes to methodological diversity and to the use of ICT in the classroom. The tool's orientation towards mobile devices and 1:1 computing reflects the general trend in education. By linking the tool with current software developments for producing digital textbooks, educational publishers are also addressed as a target group.

Relevance:

30.00%

Publisher:

Abstract:

Coastal flooding poses serious threats to coastal areas around the world, causing billions of dollars in damage to property and infrastructure and threatening the lives of millions of people. Disaster management and risk assessment therefore aim at detecting vulnerabilities and capacities in order to reduce coastal flood disaster risk. In particular, non-specialist researchers, emergency management personnel, and land-use planners require an accurate, inexpensive method to determine and map the risk associated with storm surge events and with the long-term sea-level rise driven by climate change. This study contributes to the spatial evaluation and mapping of social, economic, and environmental vulnerability and risk at the sub-national scale through the development of appropriate tools and methods embedded in a Web-GIS decision support system (DSS). A new set of raster-based models was designed to be easily implemented in the Web-GIS framework in order to quickly assess and map flood hazard characteristics, damage, and vulnerability in a multi-criteria approach. The Web-GIS DSS is built on open-source software and programming languages, and its main strength is that it is available to and usable by coastal managers and land-use planners without requiring an advanced background in hydraulic engineering. The effectiveness of the system for coastal risk assessment is evaluated through its application to a real case study.
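The raster-based multi-criteria overlay at the heart of such a DSS can be sketched in a few lines. The layer names, weights, and toy 3x3 rasters below are illustrative assumptions, not the thesis's actual models: each criterion is min-max normalized and combined as a weighted sum per cell.

```python
import numpy as np

def normalize(layer):
    """Rescale a raster layer to [0, 1] (min-max normalization)."""
    lo, hi = layer.min(), layer.max()
    return (layer - lo) / (hi - lo) if hi > lo else np.zeros_like(layer, dtype=float)

def weighted_overlay(layers, weights):
    """Combine normalized raster criteria into a single per-cell risk index."""
    weights = np.asarray(weights, dtype=float)
    weights /= weights.sum()                      # weights sum to 1
    stack = np.stack([normalize(l) for l in layers])
    return np.tensordot(weights, stack, axes=1)   # weighted sum per cell

# Hypothetical 3x3 rasters: flood depth (m), population density, asset value
depth      = np.array([[0.0, 0.5, 2.0], [0.2, 1.0, 2.5], [0.0, 0.3, 1.5]])
population = np.array([[10., 50., 80.], [20., 60., 90.], [ 5., 30., 70.]])
assets     = np.array([[ 1.,  4.,  9.], [ 2.,  5.,  8.], [ 1.,  3.,  7.]])

risk = weighted_overlay([depth, population, assets], weights=[0.5, 0.3, 0.2])
```

A real implementation would operate on full georeferenced rasters, but the per-cell weighted-sum logic is the same.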

Relevance:

30.00%

Publisher:

Abstract:

For many people, a day without the Internet is hard to imagine. The spectrum of Internet users has broadened, and with it the demands placed on websites have risen massively. The decision to stay on a website or search on another is made within a few seconds, and depends on both the website's design and the content it presents. Evaluating how quickly users can find online information and how easily they can understand it is the task of web usability testing. Both technical and linguistic aspects are responsible for the finding and understanding of information. In usability research, however, the focus has so far largely been on evaluating the technical and aesthetic aspects of websites, while the linguistic aspects have been pushed into the background. By comparison, these are less systematically researched and are scarcely found in usability guidelines; instead, one mostly encounters general recommendations. Motivated by this, the present work aims to investigate web usability systematically from both a linguistic and a formal perspective. At the linguistic level, web usability was analysed following Morris's theory of signs, and the term "linguistic web usability" was introduced. On the basis of this analysis and a literature review of several sets of usability guidelines, a catalogue of criteria was developed. To apply this catalogue in a usability study, the website of Johannes Gutenberg University Mainz (JGU) was tested in a usability laboratory using eye tracking together with the think-aloud and retrospective think-aloud methods. The empirical results show that linguistic usability problems, just like formal ones, prevent users from finding the information they seek, or at least slow down their search. Accordingly, linguistic perspectives should be incorporated into usability guidelines.

Relevance:

30.00%

Publisher:

Abstract:

In this work we study a polyenergetic and multimaterial model for breast image reconstruction in Digital Tomosynthesis, taking into consideration the variety of materials forming the object and the polyenergetic nature of the X-ray beam. The modelling of the problem leads to a high-dimensional nonlinear least-squares problem that, being an ill-posed inverse problem, requires some form of regularization. We test two main classes of methods: the Levenberg-Marquardt method (together with the conjugate gradient method for computing the descent direction) and two limited-memory BFGS-like methods (L-BFGS). We perform experiments for different values of the regularization parameter (constant or varying at each iteration), tolerances, and stopping conditions. Finally, we analyse the performance of the methods by comparing relative errors, iteration counts, run times, and the quality of the reconstructed images.
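A minimal sketch of the two solver classes on a toy Tikhonov-regularized nonlinear least-squares problem; the exponential forward model, problem size, and regularization parameter here are stand-in assumptions, since the real polyenergetic reconstruction problem is far larger:

```python
import numpy as np
from scipy.optimize import least_squares, minimize

rng = np.random.default_rng(0)

# Toy forward model y = exp(A x), a stand-in for the (much larger,
# ill-posed) polyenergetic projection operator -- illustration only.
A = rng.normal(size=(20, 5))
x_true = rng.uniform(0.1, 1.0, size=5)
y = np.exp(A @ x_true)

lam = 1e-3            # Tikhonov regularization parameter (illustrative)
x0 = np.full(5, 0.5)  # starting guess

def residuals(x):
    # Stacked residual: data misfit plus sqrt(lam)*x, so that
    # sum(residuals**2) = ||exp(Ax) - y||^2 + lam * ||x||^2
    return np.concatenate([np.exp(A @ x) - y, np.sqrt(lam) * x])

def objective(x):
    return 0.5 * np.sum(residuals(x) ** 2)

# Levenberg-Marquardt on the stacked residual vector
sol_lm = least_squares(residuals, x0, method="lm")

# Limited-memory BFGS on the equivalent scalar objective
sol_lbfgs = minimize(objective, x0, method="L-BFGS-B")
```

Both solvers start from the same point and minimize the same regularized objective, which mirrors the comparison set-up described in the abstract.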

Relevance:

30.00%

Publisher:

Abstract:

This is the second part of a study investigating a model-based transient calibration process for diesel engines. The first part addressed the data requirements and data processing required for empirical transient emission and torque models. The current work focuses on modelling and optimization. The unexpected result of this investigation is that when trained on transient data, simple regression models perform better than more powerful methods such as neural networks or localized regression. This result has been attributed to extrapolation over data that have estimated rather than measured transient air-handling parameters. The challenges of detecting and preventing extrapolation using statistical methods that work well with steady-state data have been explained. The concept of constraining the distribution of statistical leverage relative to the distribution of the starting solution to prevent extrapolation during the optimization process has been proposed and demonstrated. Separate from the issue of extrapolation is preventing the search from being quasi-static. Second-order linear dynamic constraint models have been proposed to prevent the search from returning solutions that are feasible if each point were run at steady state, but which are unrealistic in a transient sense. Dynamic constraint models translate commanded parameters to actually achieved parameters that then feed into the transient emission and torque models. Combined model inaccuracies have been used to adjust the optimized solutions. To frame the optimization problem within reasonable dimensionality, the coefficients of commanded surfaces that approximate engine tables are adjusted during search iterations, each of which involves simulating the entire transient cycle. The resulting strategy, which differs from the corresponding manual calibration strategy and achieves lower emissions and improved efficiency, is intended to improve rather than replace the manual calibration process.
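A second-order linear dynamic constraint model of the kind described can be sketched as a discrete simulation mapping a commanded parameter trace to the achieved trace. The natural frequency, damping ratio, and boost-pressure step below are illustrative assumptions; in practice such coefficients would be identified from measured transient data:

```python
import numpy as np

def second_order_response(commanded, dt, wn, zeta):
    """Map a commanded parameter trace to the achieved trace via the
    second-order linear model  y'' + 2*zeta*wn*y' + wn^2*y = wn^2*u,
    integrated with explicit Euler steps of size dt."""
    y, yd = float(commanded[0]), 0.0
    achieved = np.empty(len(commanded))
    for i, u in enumerate(commanded):
        ydd = wn**2 * (u - y) - 2.0 * zeta * wn * yd
        yd += ydd * dt
        y += yd * dt
        achieved[i] = y
    return achieved

# Hypothetical commanded step (e.g. boost pressure, arbitrary units);
# wn and zeta are assumed values, not identified from engine data.
cmd = np.concatenate([np.full(50, 1.0), np.full(200, 2.0)])
ach = second_order_response(cmd, dt=0.01, wn=5.0, zeta=0.7)
```

The achieved trace lags the commanded step, which is exactly why a quasi-static search over commanded values alone would return transiently unrealistic solutions.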

Relevance:

30.00%

Publisher:

Abstract:

In recent years, layered manufacturing (LM) processes have begun to progress from rapid prototyping techniques towards rapid manufacturing methods, where the objective is now to produce finished components for potential end use in a product (Caulfield et al., 2007). LM is especially promising for the fabrication of specific-need, low-volume products such as replacement parts for larger systems. This trend accentuates the need for a thorough understanding of the associated mechanical properties and the resulting behavior of parts produced by layered methods. Not only must the base material be durable, but the mechanical properties of the layered components must be sufficient to meet in-service loading and operational requirements, and be reasonably comparable to parts produced by more traditional manufacturing techniques. This chapter presents the details of a study completed to quantitatively analyze the potential of fused deposition modeling (FDM) to fully evolve into a rapid manufacturing tool. The project objective is to develop an understanding of the dependence of the mechanical properties of FDM parts on raster orientation and to assess whether these parts are capable of maintaining their integrity under service loading. The study examines the effect of fiber orientation, i.e. the direction of the polymer beads relative to the loading direction of the part, on a variety of important mechanical properties of ABS components fabricated by fused deposition modeling. Tensile, compressive, flexural, impact, and fatigue strength properties of FDM specimens are examined, evaluated, and compared with the properties of injection-molded ABS parts.

Relevance:

30.00%

Publisher:

Abstract:

Bite mark analysis offers the opportunity to identify a biter based on the individual characteristics of the dentition. Normally, the main focus is on analysing bite mark injuries on human bodies, but bite marks in food may also play an important role in the forensic investigation of a crime. This study presents a comparison of simulated bite marks in different kinds of food with the dentitions of the presumed biters. Bite marks were produced by six adults in slices of buttered bread, apples, different kinds of Swiss chocolate, and Swiss cheese. The influence of time lapse on the bite marks in food, under room-temperature conditions, was also examined. For the documentation of the bite marks and the dentitions of the biters, 3D optical surface scanning technology was used. The comparison was performed using two different software packages: the ATOS modelling and analysing software and the 3D studio max animation software. The ATOS software enables an automatic computation of the deviation between two meshes. In the present study, the bite marks were compared with the dentitions, as were the meshes of each bite mark recorded at the different stages of time lapse. In the 3D studio max software, the act of biting was animated to compare the dentitions with the bite mark. The examined foods recorded the individual characteristics of the dentitions very well. In all cases, the biter could be identified and the dentitions of the other presumed biters excluded. The influence of time lapse on the food depends on the kind of food and is shown in the diagrams. However, the identification of the biter could still be performed after a period of time, based on the recorded individual characteristics of the dentitions.
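The mesh-deviation computation that the ATOS software performs can be illustrated, in highly simplified form, as a nearest-neighbour distance query between two vertex clouds; the random "dentition" and rigidly shifted "bite mark" below are synthetic stand-ins, not scan data:

```python
import numpy as np
from scipy.spatial import cKDTree

def mesh_deviation(reference, test):
    """Distance from each vertex of `test` to the nearest vertex of
    `reference` -- a simplified stand-in for the surface-deviation map
    computed between a dentition scan and a bite mark scan."""
    tree = cKDTree(reference)
    distances, _ = tree.query(test)
    return distances

# Synthetic vertex clouds (real scans have far more vertices): the
# "bite mark" is the dentition cloud rigidly shifted by 0.1 units per axis.
rng = np.random.default_rng(1)
dentition = rng.uniform(0.0, 10.0, size=(500, 3))
bite_mark = dentition + 0.1

dev = mesh_deviation(dentition, bite_mark)
```

A small, spatially uniform deviation map like this one indicates a good match; a non-matching dentition would produce large localized deviations.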

Relevance:

30.00%

Publisher:

Abstract:

Background: Pelvic inflammatory disease (PID) results from the ascending spread of microorganisms from the vagina and endocervix to the upper genital tract. PID can lead to infertility, ectopic pregnancy and chronic pelvic pain. The timing of the development of PID after infection with the sexually transmitted bacterium Chlamydia trachomatis (chlamydia) might affect the impact of screening interventions, but is currently unknown. This study investigates three hypothetical processes for the timing of progression: at the start, at the end, or throughout the duration of chlamydia infection.

Methods: We develop a compartmental model that describes the trial structure of a published randomised controlled trial (RCT) and allows each of the three processes to be examined using the same model structure. The RCT estimated the effect of a single chlamydia screening test on the cumulative incidence of PID up to one year later. For each hypothetical process, the fraction of chlamydia-infected women who progress to PID is obtained by the maximum likelihood method using the results of the RCT.

Results: The predicted cumulative incidence of PID cases from all causes after one year depends on the fraction of chlamydia-infected women who progress to PID and on the type of progression. Progression at a constant rate throughout a chlamydia infection, or at the end of the infection, was compatible with the findings of the RCT. The corresponding estimated fraction of chlamydia-infected women who develop PID is 10% (95% confidence interval 7-13%) under both processes.

Conclusions: The findings of this study suggest that clinical PID can occur throughout the course of a chlamydia infection, which leaves a window of opportunity for screening to prevent PID.
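For the constant-rate process, the progressing fraction follows from a simple competing-risks argument: with progression rate p and clearance rate r, a fraction p/(p+r) of infections ends in PID. A sketch with illustrative rates chosen so this fraction matches the 10% estimate (the clearance rate itself is an assumption, not a value taken from the study):

```python
import numpy as np

def pid_fraction_constant_rate(p, r):
    """Fraction of infected women who develop PID when progression (rate p)
    competes with clearance of the infection (rate r)."""
    return p / (p + r)

def cumulative_pid(p, r, t):
    """Cumulative probability of PID by time t under the same model:
    (p/(p+r)) * (1 - exp(-(p+r) t))."""
    return (p / (p + r)) * (1.0 - np.exp(-(p + r) * t))

# Illustrative rates per year: clearance rate r is assumed, and the
# progression rate p is chosen so that p/(p+r) equals the 10% estimate.
r = 0.74
p = r / 9.0

frac = pid_fraction_constant_rate(p, r)
```

The cumulative incidence rises towards the limiting fraction as the infection runs its course, which is what makes early screening a window of opportunity.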

Relevance:

30.00%

Publisher:

Abstract:

Open web steel joists are designed in the United States following the governing specification published by the Steel Joist Institute. For compression members in joists, this specification employs an effective length factor, or K-factor, in confirming their adequacy. In most cases, these K-factors have been conservatively assumed equal to 1.0 for compression web members, even though intuition and limited experimental work indicate that smaller values could be justified. Given that smaller K-factors could result in more economical designs without a loss of safety, the research presented in this thesis aims to suggest procedures for obtaining more rational values. Three different methods for computing in-plane and out-of-plane K-factors are investigated: (1) a hand calculation method based on the use of alignment charts, (2) computational critical load (eigenvalue) analyses using uniformly distributed loads, and (3) computational analyses using a compressive strain approach. The latter method is novel and allows the individual buckling load of a specific member within a system, such as a joist, to be computed. Four different joist configurations are investigated: an 18K3, a 28K10, and two variations of a 32LH06. Based on these methods and the very limited number of joists studied, it appears promising that in-plane and out-of-plane K-factors of 0.75 and 0.85, respectively, could be used in computing the flexural buckling strength of web members in routine steel joist design. Recommendations for future work, which include systematically investigating a wider range of joist configurations and connection restraints, are provided.
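The economic significance of a smaller K-factor is easy to illustrate with the Euler elastic buckling load P_cr = π²EI/(KL)²; the section properties below are hypothetical, not taken from the joists studied:

```python
import math

def euler_buckling_load(E, I, L, K):
    """Elastic flexural buckling load P_cr = pi^2 * E * I / (K * L)^2."""
    return math.pi ** 2 * E * I / (K * L) ** 2

# Hypothetical web member properties (kips and inches; illustrative only)
E = 29000.0   # ksi, steel elastic modulus
I = 0.19      # in^4, weak-axis moment of inertia
L = 30.0      # in, member length

P_k100 = euler_buckling_load(E, I, L, K=1.00)
P_k075 = euler_buckling_load(E, I, L, K=0.75)
print(P_k075 / P_k100)   # 1 / 0.75^2, i.e. about 78% more elastic capacity
```

Because K enters the denominator squared, reducing K from 1.0 to 0.75 raises the computed elastic buckling load by a factor of 1/0.75² ≈ 1.78, which is the source of the potential design economy.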

Relevance:

30.00%

Publisher:

Abstract:

OBJECTIVE: To systematically and critically review the evidence used to derive estimates of the costs and cost-effectiveness of chlamydia screening. METHODS: Systematic review. We searched 11 electronic bibliographic databases from the earliest available date to August 2004 using keywords including chlamydia, pelvic inflammatory disease, economic evaluation, and cost. We included studies of chlamydia screening in males and/or females over 14 years of age, including studies of diagnostic tests, contact tracing, and treatment as part of a screening programme. Outcomes included cases of chlamydia identified and major outcomes averted. We assessed methodological quality and the modelling approach used. RESULTS: Of 713 identified papers, we included 57 formal economic evaluations and two cost studies. Most studies found chlamydia screening to be cost-effective, partner notification to be an effective adjunct, and testing with nucleic acid amplification tests and treatment with azithromycin to be cost-effective. Methodological problems limited the validity of these findings: most studies used static models, which are inappropriate for infectious diseases; restricted outcomes were used as a basis for policy recommendations; and high estimates of the probability of chlamydia-associated complications might have overestimated cost-effectiveness. Two high-quality dynamic modelling studies found opportunistic screening to be cost-effective, but poor reporting and uncertainty about complication rates make interpretation difficult. CONCLUSION: The inappropriate use of static models to study interventions against a communicable disease means that it remains uncertain whether chlamydia screening programmes are cost-effective. The results of this review can be used by health service managers allocating resources, and by health economists and other researchers considering further research in this area.

Relevance:

30.00%

Publisher:

Abstract:

We present a state-of-the-art application of smoothing for dependent bivariate binomial spatial data to Loa loa prevalence mapping in West Africa. This application is special because it starts with the non-spatial calibration of survey instruments, continues with spatial model building and assessment, and ends with robust, tested software that will be used by field scientists of the World Health Organization for online prevalence map updating. From a statistical perspective, several important methodological issues were addressed: (a) building spatial models that are complex enough to capture the structure of the data but remain computationally usable; (b) reducing the computational burden in the handling of very large covariate data sets; (c) devising methods for comparing spatial prediction methods for a given exceedance policy threshold.
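Issue (c) concerns exceedance probabilities: given posterior predictive samples of prevalence at each location, the probability of exceeding a policy threshold is simply the fraction of draws above it. A sketch with synthetic Beta-distributed draws and an assumed 20% threshold (both are illustrative choices, not values from the paper):

```python
import numpy as np

def exceedance_probability(samples, threshold):
    """Per-location probability that prevalence exceeds a policy threshold,
    estimated as the fraction of posterior draws above it.
    `samples` has shape (locations, draws)."""
    return (samples > threshold).mean(axis=1)

# Synthetic posterior draws for three locations (Beta parameters assumed);
# the 20% prevalence threshold is used purely for illustration.
rng = np.random.default_rng(2)
draws = rng.beta(a=[[2.0], [8.0], [20.0]], b=[[18.0], [12.0], [5.0]],
                 size=(3, 1000))
p_exceed = exceedance_probability(draws, threshold=0.20)
```

Mapping these per-location probabilities, rather than point estimates of prevalence, is what lets a policy maker act on a chosen exceedance threshold with a stated level of confidence.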