888 results for Web modelling methods


Relevance:

30.00%

Publisher:

Abstract:

Background: The impact of cancer upon children, teenagers and young people can be profound. Research has been undertaken to explore these impacts, but little is known about how researchers can 'best' engage with this group to explore their experiences. This review paper provides an overview of the utility of data collection methods employed when undertaking research with children, teenagers and young people. A systematic review of relevant databases was undertaken utilising the search terms 'young people', 'young adult', 'adolescent' and 'data collection methods'. The full text of papers deemed eligible from the title and abstract was accessed and, following discussion within the research team, thirty papers were included. Findings: Owing to the heterogeneity in the scope of the papers identified, the following data collection methods were included in the results. Three of the papers provided an overview of data collection methods utilised with this population; the remaining twenty-seven covered the following methods: digital technologies; art-based research; comparisons of 'paper and pencil' research with web-based technologies; the use of games; the use of a specific communication tool; questionnaires and interviews; focus groups; and telephone interviews/questionnaires. The strengths and limitations of these data collection methods are discussed, drawing upon such issues as the appropriateness of particular methods for particular age groups, or the most appropriate method to employ when exploring a particularly sensitive topic area. Conclusions: A number of data collection methods are utilised to undertake research with children, teenagers and young adults. This review provides a summary of the currently available evidence and an overview of the strengths and limitations of the data collection methods employed.

Relevance:

30.00%

Publisher:

Abstract:

Once the preserve of university academics and research laboratories with high-powered and expensive computers, the power of sophisticated mathematical fire models has now arrived on the desktop of the fire safety engineer. It is a revolution made possible by parallel advances in PC technology and fire modelling software. But while the tools have proliferated, there has not been a corresponding transfer of knowledge and understanding of the discipline from expert to general user. It is a serious shortfall of which the lack of suitable engineering courses dealing with the subject is symptomatic, if not the cause. The computational vehicles to run the models and an understanding of fire dynamics are not enough to exploit these sophisticated tools. Too often, they become 'black boxes' producing magic answers in exciting three-dimensional colour graphics and client-satisfying 'virtual reality' imagery. As well as a fundamental understanding of the physics and chemistry of fire, the fire safety engineer must have at least a rudimentary understanding of the theoretical basis supporting fire models in order to appreciate their limitations and capabilities. The five-day short course, "Principles and Practice of Fire Modelling", run by the University of Greenwich, attempts to bridge the divide between the expert and the general user, providing the latter with the expertise needed to understand the results of mathematical fire modelling. The course and the associated textbook, "Mathematical Modelling of Fire Phenomena", are aimed at students and professionals with a wide and varied background; they offer a friendly guide through the unfamiliar terrain of mathematical modelling. Concepts and techniques are introduced and demonstrated in seminars, and those attending also gain experience in using the methods during 'hands-on' tutorial and workshop sessions. On completion of this short course, those participating should: be familiar with the concepts of zone and field modelling; be familiar with zone and field model assumptions; have an understanding of the capabilities and limitations of modelling software packages for zone and field modelling; be able to select and use the most appropriate mathematical software and demonstrate its use in compartment fire applications; and be able to interpret model predictions. The result is that the fire safety engineer is empowered to realise the full value of mathematical models in predicting fire development and determining the consequences of fire under a variety of conditions. This in turn enables him or her to design and implement safety measures which can potentially control, or at the very least reduce, the impact of fire.
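
To make the distinction concrete: a zone model treats the compartment as one or a few well-mixed gas volumes governed by energy balances, whereas a field (CFD) model resolves the flow on a mesh. The following is a deliberately minimal single-zone energy balance — all values are assumed for illustration, and this is a sketch, not course material:

```python
# Illustrative single-zone compartment energy balance (all values assumed;
# real zone models such as CFAST track multiple layers, vents and species):
#   rho * cp * V * dT/dt = Q_fire - h * A_wall * (T - T_wall)

rho, cp, V = 1.2, 1000.0, 60.0   # air density [kg/m^3], heat capacity [J/kgK], volume [m^3]
h, A_wall = 10.0, 94.0           # wall heat-loss coefficient [W/m^2K], wall area [m^2]
Q_fire = 250e3                   # steady heat release rate [W]
T, T_wall, dt = 293.0, 293.0, 0.1

for step in range(6000):         # ten minutes of fire in 0.1 s steps
    dTdt = (Q_fire - h * A_wall * (T - T_wall)) / (rho * cp * V)
    T += dt * dTdt

print(f"gas temperature after 10 min: {T - 273.0:.0f} C")
```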

Relevance:

30.00%

Publisher:

Abstract:

Finance is one of the fastest growing areas in modern applied mathematics with real-world applications. The interest of this branch of applied mathematics is best described by an example involving shares. Shareholders of a company receive dividends, which come from the profit made by the company. The proceeds of the company, once it is taken over or wound up, will also be distributed to shareholders. Shares therefore have a value that reflects the views of investors about the likely dividend payments and capital growth of the company, and this value is quantified by the share price on stock exchanges. Financial modelling thus serves to understand the correlations between asset movements and buy/sell activity in order to reduce risk. Such activities depend on financial analysis tools being available to the trader, with which rapid and systematic evaluation of buy/sell contracts can be made. There are other financial activities, and it is not the intention of this paper to discuss them all. The main concern of this paper is to propose a parallel algorithm for the numerical solution of a European option. The paper is organised as follows. First, a brief introduction is given to a simple mathematical model for European options and to possible numerical schemes for solving such a model. Second, the Laplace transform is applied to the mathematical model, which leads to a set of parametric equations whose solutions may be found concurrently. Numerical inversion of the Laplace transform is done by means of the inversion algorithm developed by Stehfest, and the scalability of the algorithm in a distributed environment is demonstrated. Third, the performance of the present algorithm is compared with that of a spatial domain decomposition developed particularly for the time-dependent heat equation. Finally, a number of issues are discussed and future work is suggested.
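
Stehfest's inversion formula is compact enough to sketch. The following is a generic illustration (the function names and the test transform are assumptions of this sketch, not the paper's code); note that each evaluation of F(s) is independent of the others, which is exactly the concurrency the paper exploits:

```python
import math

def stehfest_weights(N=12):
    # Stehfest (1970) weights; N must be even
    V = []
    for k in range(1, N + 1):
        s = 0.0
        for j in range((k + 1) // 2, min(k, N // 2) + 1):
            s += (j ** (N // 2) * math.factorial(2 * j) /
                  (math.factorial(N // 2 - j) * math.factorial(j) *
                   math.factorial(j - 1) * math.factorial(k - j) *
                   math.factorial(2 * j - k)))
        V.append((-1) ** (k + N // 2) * s)
    return V

def invert(F, t, N=12):
    # f(t) ~ (ln2 / t) * sum_k V_k * F(k ln2 / t); every F() call is
    # independent, so the k-loop parallelises naturally across processors
    ln2 = math.log(2.0)
    V = stehfest_weights(N)
    return ln2 / t * sum(V[k - 1] * F(k * ln2 / t) for k in range(1, N + 1))

# sanity check with a transform whose inverse is known: F(s) = 1/(s+1) -> e^{-t}
print(invert(lambda s: 1.0 / (s + 1.0), t=1.0), math.exp(-1.0))
```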

Relevance:

30.00%

Publisher:

Abstract:

This paper presents a three-dimensional, thermo-mechanical modelling approach to the cooling and solidification phases associated with the shape casting of metals, i.e. die, sand and investment casting. Novel vertex-based Finite Volume (FV) methods are described and employed with regard to the small-strain, non-linear Computational Solid Mechanics (CSM) capabilities required to model shape casting. The CSM capabilities include the non-linear material phenomena of creep and thermo-elasto-visco-plasticity at both high and low temperatures, and also the multi-body deformable contact which can occur between the metal casting and the mould. The vertex-based FV methods, which can be readily applied to unstructured meshes, are included within a comprehensive FV modelling framework, PHYSICA. The heat transfer (by conduction and convection), filling, porosity and solidification algorithms already existing within PHYSICA for the complete modelling of the shape casting process employ cell-centred FV methods. The thermo-mechanical coupling is performed in a staggered incremental fashion, which addresses the possible gap formation between the component and the mould, and is ultimately validated against a variety of shape casting benchmarks.
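
The staggered incremental coupling can be pictured with a deliberately tiny one-dimensional sketch — assumed material values, pure thermo-elasticity with no creep, contact or gap formation, and nothing like PHYSICA's actual discretisation:

```python
import numpy as np

# Minimal 1D staggered thermo-mechanical sketch (hypothetical values):
# each increment first advances the thermal field, then feeds the new
# temperatures into a small-strain elastic update, mirroring the
# staggered coupling described above.

nx, L = 21, 0.1                # nodes, rod length [m]
x = np.linspace(0.0, L, nx)
dx = x[1] - x[0]
kappa = 1e-5                   # thermal diffusivity [m^2/s] (assumed)
E, alpha = 70e9, 2.3e-5        # Young's modulus [Pa], expansion [1/K] (assumed)
T = np.full(nx, 700.0)         # initial casting temperature [K]
T_mould = 300.0                # mould held cold at both ends
dt = 0.4 * dx**2 / kappa       # explicit stability limit

for step in range(200):
    # 1) thermal increment: explicit conduction with fixed mould walls
    T[0] = T[-1] = T_mould
    T[1:-1] += dt * kappa * (T[2:] - 2 * T[1:-1] + T[:-2]) / dx**2
    # 2) mechanical increment: fully constrained rod -> thermal stress
    sigma = -E * alpha * (T - 700.0)   # tension develops as the metal cools

print("peak tensile stress [MPa]:", sigma.max() / 1e6)
```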

Relevance:

30.00%

Publisher:

Abstract:

The difficulties encountered in implementing large-scale computational mechanics (CM) codes on multiprocessor systems are now fairly well understood. Despite the claims of shared memory architecture manufacturers to provide effective parallelising compilers, these have not proved to be adequate for large or complex programs. Significant programmer effort is usually required to achieve reasonable parallel efficiencies on significant numbers of processors. The paradigm of Single Program Multiple Data (SPMD) domain decomposition with message passing, where each processor runs the same code on a subdomain of the problem and communicates through the exchange of messages, has for some time been demonstrated to provide the required level of efficiency, scalability and portability across both shared and distributed memory systems, without the need to re-author the code into a new language or to support differing message passing implementations. Extension of the methods into three dimensions has been enabled through the engineering of PHYSICA, a framework for supporting 3D, unstructured mesh, continuum mechanics modelling. In PHYSICA, six inspectors (runtime routines that determine the communication patterns required by irregular data accesses) are used. Part of the challenge in automating parallelisation is being able to prove the equivalence of inspectors so that they can be merged into as few as possible.
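
The SPMD pattern itself is simple to sketch: every processor runs the identical program on its own subdomain and swaps halo (overlap) values with its neighbours each iteration. Below is a minimal illustration using mpi4py — a convenience assumption of this sketch, not the technology the paper describes:

```python
# Run with e.g.: mpirun -n 4 python spmd_sketch.py
from mpi4py import MPI
import numpy as np

comm = MPI.COMM_WORLD
rank, size = comm.Get_rank(), comm.Get_size()

n_local = 100                              # cells owned by this rank
u = np.full(n_local + 2, float(rank))      # +2 halo cells at each end

# neighbours in a 1D decomposition; PROC_NULL makes boundary exchanges no-ops
left = rank - 1 if rank > 0 else MPI.PROC_NULL
right = rank + 1 if rank < size - 1 else MPI.PROC_NULL

for it in range(50):
    # exchange halo values with neighbouring subdomains
    comm.Sendrecv(u[1:2], dest=left, recvbuf=u[-1:], source=right)
    comm.Sendrecv(u[-2:-1], dest=right, recvbuf=u[0:1], source=left)
    # local smoothing step: same code on every processor, different data
    u[1:-1] = (u[:-2] + 2.0 * u[1:-1] + u[2:]) / 4.0

print(f"rank {rank}: boundary values {u[1]:.3f}, {u[-2]:.3f}")
```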

Relevance:

30.00%

Publisher:

Abstract:

Authentication plays an important role in how we interact with computers, mobile devices, the web, etc. The idea of authentication is to uniquely identify a user before granting access to system privileges. For example, in recent years more corporate information and applications have become accessible via the Internet and Intranet. Many employees work from remote locations and need access to secure corporate files. During this time, it is possible for malicious or unauthorized users to gain access to the system. For this reason, it is logical to have some mechanism in place to detect whether the logged-in user is the same user still in control of the session. Therefore, highly secure authentication methods must be used. We posit that each of us is unique in our use of computer systems. It is this uniqueness that is leveraged to "continuously authenticate users" while they use web software. To monitor user behavior, n-gram models are used to capture user interactions with web-based software. This statistical language model essentially captures sequences and sub-sequences of user actions, their orderings, and the temporal relationships that make them unique, by providing a model of how each user typically behaves. Users are then continuously monitored during software operations. Large deviations from "normal behavior" can possibly indicate malicious or unintended behavior. This approach is implemented in a system called Intruder Detector (ID) that models user actions as embodied in web logs generated in response to a user's actions. User identification through web logs is cost-effective and non-intrusive. We perform experiments on a large fielded system with web logs of approximately 4000 users. For these experiments, we use two classification techniques: binary and multi-class classification. We evaluate model-specific differences in user behavior based on coarse-grained (i.e., role) and fine-grained (i.e., individual) analysis. A specific set of metrics is used to provide valuable insight into how each model performs. Intruder Detector achieves accurate results when identifying legitimate users and user types. This tool is also able to detect outliers in role-based user behavior with optimal performance. In addition to web applications, this continuous monitoring technique can be used with other user-based systems such as mobile devices and the analysis of network traffic.
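
A minimal sketch of the n-gram idea follows — the function names, smoothing scheme and toy action vocabulary are assumptions of this illustration, not Intruder Detector's actual implementation: learn per-user trigram frequencies over logged actions, then score a new session by its average log-likelihood under each user's model.

```python
import math
from collections import defaultdict

def train_ngram(actions, n=3):
    # count n-grams and their (n-1)-gram contexts for one user's history
    counts, context = defaultdict(int), defaultdict(int)
    padded = ["<s>"] * (n - 1) + list(actions)
    for i in range(len(actions)):
        gram = tuple(padded[i:i + n])
        counts[gram] += 1
        context[gram[:-1]] += 1
    return counts, context

def score(actions, model, n=3, vocab=1000):
    # average log-likelihood; add-one smoothing gives unseen sequences
    # low but finite scores, so deviations show up as large negative values
    counts, context = model
    padded = ["<s>"] * (n - 1) + list(actions)
    ll = 0.0
    for i in range(len(actions)):
        gram = tuple(padded[i:i + n])
        ll += math.log((counts[gram] + 1) / (context[gram[:-1]] + vocab))
    return ll / max(len(actions), 1)

model = train_ngram(["login", "search", "view", "search", "view", "logout"])
print(score(["login", "search", "view"], model))    # familiar behaviour
print(score(["login", "export", "export"], model))  # deviates from the model
```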

Relevance:

30.00%

Publisher:

Abstract:

The hierarchical forest planning process currently in place on public lands risks failing at two levels. At the upper level, the process in place does not provide sufficient proof of the sustainability of the current harvest level. At the lower level, the process in place does not support the realisation of the full value-creation potential of the forest resource, sometimes needlessly constraining short-term harvest planning. These failures are attributable to certain assumptions implicit in the wood supply optimisation model, which could explain why this problem is not well documented in the literature. We use agency theory to model the hierarchical forest planning process on public lands. We develop an iterative two-stage simulation framework to estimate the long-term effect of the interaction between the State and the fibre consumer, allowing us to establish certain conditions that can lead to stockouts. We then propose an improved formulation of the wood supply optimisation model. The classical formulation of this model (i.e., maximisation of the sustained fibre yield) does not consider that the industrial fibre consumer seeks to maximise profit, but instead assumes total consumption of the fibre supply in every period, regardless of its value-creation potential. We extend the classical formulation to allow anticipation of the fibre consumer's behaviour, thereby increasing the probability that the fibre supply will be entirely consumed and restoring the validity of the total-consumption assumption implicit in the optimisation model. We model the principal-agent relationship between the government and industry using a bilevel formulation of the optimisation model, where the upper level represents the wood supply determination process (the government's responsibility) and the lower level represents the fibre consumption process (industry's responsibility). We show that the bilevel formulation can mitigate the risk of stockouts, thereby improving the credibility of the hierarchical forest planning process. Together, the bilevel wood supply optimisation model and the methodology we developed to solve it to optimality represent an alternative to the methods currently in use. Our bilevel model and the iterative simulation framework represent a step forward in value-driven forest planning technology. The explicit integration of industrial objectives and constraints into the forest planning process, from the moment the wood supply is determined, should foster greater collaboration between government and industry, making it possible to exploit the full value-creation potential of the forest resource.
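
The leader-follower structure described above can be summarised in a generic bilevel skeleton; the symbols are illustrative assumptions, not the thesis's exact formulation:

```latex
% Generic bilevel (principal-agent) skeleton: the government sets the
% wood supply x anticipating the industry's profit-maximising response y.
\begin{align*}
\max_{x \in X}\ & Z(x)
  && \text{upper level: government's wood supply objective} \\
\text{s.t.}\ & y^{*}(x) \in \arg\max_{y \in Y(x)} \pi(x, y)
  && \text{lower level: industry's profit-maximising consumption} \\
& c\big(x, y^{*}(x)\big) \le 0
  && \text{consistency: supply actually consumed (no stockouts).}
\end{align*}
```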

Relevance:

30.00%

Publisher:

Abstract:

This thesis presents quantitative studies of T cell and dendritic cell (DC) behaviour in mouse lymph nodes (LNs) in the naive state and following immunisation. These processes are of importance and interest in basic immunology, and better understanding could improve both diagnostic capacity and therapeutic manipulations, potentially helping to produce more effective vaccines or to develop treatments for autoimmune diseases. The problem is also interesting conceptually, as it is relevant to other fields where the 3D movement of objects is tracked with a discrete scanning interval. A general immunology introduction is presented in chapter 1. In chapter 2, I apply quantitative methods to multi-photon imaging data to measure how T cells and DCs are spatially arranged in LNs. This has previously been studied to describe differences between the naive and immunised states and as an indicator of the magnitude of the immune response in LNs, but previous analyses have been largely descriptive. The quantitative analysis shows that some of the previous conclusions may have been premature. In chapter 3, I use Bayesian state-space models to test hypotheses about the mode of T cell search for DCs. A two-state mode of movement, where T cells can be classified as either interacting with a DC or freely migrating, is supported over a model where T cells home in on DCs at a distance through, for example, the action of chemokines. In chapter 4, I study whether T cell migration is linked to the geometric structure of the fibroblastic reticular cell (FRC) network. I find support for the hypothesis that movement is constrained to the FRC network over an alternative 'random walk with persistence time' model in which cells would move randomly, with short-term persistence driven by a hypothetical T cell-intrinsic 'clock'. I also present unexpected results on the FRC network geometry. Finally, a quantitative method is presented for addressing some measurement biases inherent to multi-photon imaging. In all three chapters, novel findings are made, and the methods developed have the potential for further use to address important problems in the field. In chapter 5, I present a summary and synthesis of the results from chapters 3-4 and a more speculative discussion of these results and potential future directions.
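
The two-state hypothesis from chapter 3 is easy to illustrate in simulation — the transition probabilities and speeds below are invented for the example, and the thesis infers the hidden states with Bayesian state-space models rather than the crude threshold used here:

```python
import numpy as np

# Two-state movement: cells alternate between a slow "interacting" state
# and a fast "migrating" state; step lengths are observed at a discrete
# scan interval, as in multi-photon imaging.
rng = np.random.default_rng(0)

P = np.array([[0.9, 0.1],    # interacting -> stay / start migrating
              [0.2, 0.8]])   # migrating  -> return to DC / keep migrating
speeds = [1.0, 8.0]          # mean step per state [um/scan] (assumed)

state, steps, states = 0, [], []
for t in range(500):
    state = rng.choice(2, p=P[state])
    steps.append(rng.exponential(speeds[state]))
    states.append(state)

# a simple speed threshold already recovers the hidden state fairly well;
# a state-space model does this probabilistically and handles ambiguity
guess = [int(s > 3.0) for s in steps]
acc = np.mean([g == s for g, s in zip(guess, states)])
print(f"threshold classification accuracy: {acc:.2f}")
```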

Relevance:

30.00%

Publisher:

Abstract:

We develop a body size growth model of Northern cod (Gadus morhua) in Northwest Atlantic Fisheries Organization (NAFO) Divisions 2J3KL during 2009-2013, using individual length-at-age data from the bottom trawl survey in these divisions over the same period. We use the Von Bertalanffy (VonB) model extended to account for between-individual variation in growth, and for variation that may be caused by the methods by which fish are caught and sampled for length and age measurements. We assume between-individual variation in growth arises because individuals grow at different rates (k) and achieve different maximum sizes (l∞). We also include measurement error (ME) in length and age in our model, since ignoring these errors can lead to biased estimates of the growth parameters. We use the structural errors-in-variables (SEV) approach to estimate individual variation in growth, ageing error variation, and the true age distribution of the fish. Our results show the existence of individual variation in growth and of ME in age. According to the negative log likelihood ratio (NLLR) test, the best model indicated that: 1) growth patterns differ across divisions and years; 2) between-individual variation in growth is the same within a division across years; and 3) the ME in age and the true age distribution differ for each year and division.
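
The extended VonB structure is straightforward to simulate. The sketch below uses invented distributions and values purely for illustration; the paper's SEV approach estimates these variance components jointly rather than fitting a single mean curve:

```python
import numpy as np
from scipy.optimize import curve_fit

# Von Bertalanffy growth: l(a) = l_inf * (1 - exp(-k * (a - a0)))
def vonb(age, l_inf, k, a0=0.0):
    return l_inf * (1.0 - np.exp(-k * (age - a0)))

rng = np.random.default_rng(1)
n = 1000
k = rng.lognormal(np.log(0.15), 0.15, n)   # individual growth rates
l_inf = rng.normal(110.0, 8.0, n)          # individual asymptotic lengths [cm]
true_age = rng.uniform(1.0, 15.0, n)

obs_age = true_age + rng.normal(0.0, 0.5, n)                  # ageing error
obs_len = vonb(true_age, l_inf, k) + rng.normal(0.0, 2.0, n)  # length ME

# a naive fit of one mean curve ignores both error sources — precisely the
# practice the SEV approach is designed to avoid
popt, _ = curve_fit(vonb, obs_age, obs_len, p0=[100.0, 0.2])
print("naive estimates (l_inf, k):", popt.round(3))
```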

Relevance:

30.00%

Publisher:

Abstract:

Aim: The spread of non-indigenous species in marine ecosystems worldwide is one of today's most serious environmental concerns. Using mechanistic modelling, we investigated how global change relates to the invasion of European coasts by a non-native marine invertebrate, the Pacific oyster Crassostrea gigas. Location: Bourgneuf Bay on the French Atlantic coast was considered the northern boundary of C. gigas expansion at the time of its introduction to Europe in the 1970s. From this latitudinal reference, variations in the spatial distribution of the C. gigas reproductive niche were analysed along the north-western European coast from Gibraltar to Norway. Methods: The effects of environmental variations on C. gigas physiology and phenology were studied using a bioenergetics model based on Dynamic Energy Budget theory. The model was forced with environmental time series including in situ phytoplankton data, and satellite data of sea surface temperature and suspended particulate matter concentration. Results: Simulation outputs were successfully validated against in situ oyster growth data. In Bourgneuf Bay, the rise in seawater temperature and phytoplankton concentration has increased C. gigas reproductive effort and led to precocious spawning periods since the 1960s. At the European scale, the seawater temperature increase caused a drastic northward shift (1400 km within 30 years) in the C. gigas reproductive niche and in the optimal thermal conditions for early life-stage development. Main conclusions: We demonstrated that the poleward expansion of the invasive species C. gigas is related to global warming and to the increase in phytoplankton abundance. The combination of mechanistic bioenergetics modelling with in situ and satellite environmental data is a valuable framework for ecosystem studies. It offers a generic approach for analysing historical geographical shifts and for predicting the biogeographical changes expected to occur in a climate-changing world.

Relevance:

30.00%

Publisher:

Abstract:

The research investigates the feasibility of using web-based project management systems for dredging. To achieve this objective, the research assessed both the positive and negative aspects of using web-based technology for the management of dredging projects. Information gained from the literature review and from prior investigations of dredging projects revealed that project performance and the social, political, technical and business aspects of the organization were important factors in deciding to use web-based systems for the management of dredging projects. These factors were used to develop the research assumptions. An exploratory case study methodology was used to gather the empirical evidence and perform the analysis. An operational prototype of the system was developed to help evaluate developmental and functional requirements, as well as the influence on performance and on the organization. The evidence gathered from three case study projects, and from a survey of 31 experts, was used to validate the assumptions. Baselines representing the assumptions were created as a reference to assess the responses and qualitative measures, and the deviation of the responses from these baselines was used in the analysis. Finally, the conclusions were assessed by validating the assumptions against the evidence derived from the analysis. The research findings are as follows: 1. The system would help improve project performance. 2. Resistance may be experienced if the system is implemented; therefore, resistance to implementation needs to be investigated further, and more R&D work is needed before advancing to the final design and implementation. 3. The system may be divided into standalone modules in order to simplify it and facilitate incremental change. 4. The QA/QC conceptual approach used by this research needs to be redefined during future R&D to satisfy both owners and contractors. Yin's (2009) Case Study Research: Design and Methods was used to develop the research approach, design, data collection and analysis. Markus's (1983) resistance theory was used during the definition of the assumptions to predict potential problems with the implementation of web-based project management systems in the dredging industry. Keen's (1981) incremental-change and facilitative-approach tactics were used as a basis for classifying solutions and for overcoming resistance to the implementation of the web-based project management system. Davis's (1989) Technology Acceptance Model (TAM) was used to assess the solutions needed to overcome resistance to the implementation of web-based management systems for dredging projects.

Relevance:

30.00%

Publisher:

Abstract:

Objectives: Because there is scientific evidence that an appropriate intake of dietary fibre should be part of a healthy diet, given its importance in promoting health, the present study aimed to develop and validate an instrument to evaluate the knowledge of the general population about dietary fibres. Study design: The present study was a cross-sectional study. Methods: The methodological psychometric validation study was conducted with 6010 participants residing in ten countries on three continents. The instrument is a self-report questionnaire aimed at collecting information on knowledge about dietary fibres. For the exploratory factor analysis (EFA), principal component analysis with varimax orthogonal rotation and eigenvalues greater than 1 was chosen. For the confirmatory factor analysis by structural equation modelling (SEM), the covariance matrix was considered and the maximum likelihood estimation algorithm was adopted for parameter estimation. Results: The exploratory factor analysis retained two factors. The first was called Dietary Fibre and Promotion of Health (DFPH) and included 7 questions that explained 33.94% of the total variance (α = 0.852). The second was named Sources of Dietary Fibre (SDF) and included 4 questions that explained 22.46% of the total variance (α = 0.786). The model was tested by SEM, giving a final solution with four questions in each factor. This model showed a very good fit on practically all the indexes considered, except for the χ²/df ratio. The values of average variance extracted (0.458 and 0.483) demonstrate the existence of convergent validity; the results also prove the existence of discriminant validity of the factors (r² = 0.028); and, finally, good internal consistency was confirmed by the values of composite reliability (0.854 and 0.787). Conclusions: This study allowed the validation of the KADF scale, increasing the degree of confidence in the information obtained through this instrument in this and in future studies.
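
Cronbach's alpha, the internal-consistency statistic reported above, is simple to compute; the sketch below uses synthetic scores purely for illustration (the study's own item responses would take their place):

```python
import numpy as np

# Cronbach's alpha: alpha = k/(k-1) * (1 - sum(item variances) / total variance)
def cronbach_alpha(items):
    # items: array of shape (n_respondents, n_items)
    items = np.asarray(items, float)
    k = items.shape[1]
    item_var = items.var(axis=0, ddof=1).sum()
    total_var = items.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1.0 - item_var / total_var)

# synthetic respondents: a shared latent trait plus item-level noise,
# mimicking a 7-item factor such as DFPH
rng = np.random.default_rng(2)
latent = rng.normal(size=(200, 1))
scores = latent + rng.normal(scale=0.8, size=(200, 7))
print(round(cronbach_alpha(scores), 3))
```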

Relevance:

30.00%

Publisher:

Abstract:

BACKGROUND: Decision-analytic modelling (DAM) has become a widespread method in health technology assessment (HTA), but the extent to which modelling is used differs among international HTA institutions. In Germany, the use of DAM is optional within HTAs of the German Institute of Medical Documentation and Information (DIMDI). Our study examines the use of DAM in DIMDI HTA reports and its effect on the quality of information provided for health policy. METHODS: A review of all DIMDI HTA reports (from 1998 to September 2012) incorporating an economic assessment was performed. All included reports were divided into two groups: HTAs with DAM and HTAs without DAM. In both groups, reports were categorized according to the quality of information provided for healthcare decision making. RESULTS: Of the sample of 107 DIMDI HTA reports, 17 (15.9%) used DAM for the economic assessment. In the group without DAM, conclusions were limited by the quality of economic information in 51.1% of the reports, whereas no limited conclusions were found in the group with DAM. Furthermore, 24 reports without DAM (26.7%) stated that using DAM would likely have improved the quality of information of the economic assessment. CONCLUSION: The use of DAM techniques can improve the quality of HTAs in Germany. When, after a systematic review of the existing literature within an HTA, it is clear that DAM is likely to positively affect the quality of the economic assessment, DAM should be used.

Relevance:

30.00%

Publisher:

Abstract:

The visibility of a Web page involves the process of improving the site's position in the results returned by search engines such as Google. Many companies compete aggressively for the first position in the most popular search engines. As a general rule, the sites that appear higher in the results tend to attract more traffic to their pages, and thus potentially more business. This article describes the main models for enriching search results with information such as dates or locations: key-value information that allows the user to interact with the content of a Web page directly from the search results page. The fundamental contribution of the article is to show the utility of different markup formats for enriching fragments of a Web page, with the aim of helping companies that plan to implement semantic enrichment methods in the structuring of their Web sites.
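
One widely used family of markup formats is the schema.org vocabulary, which can be embedded as microdata, RDFa or JSON-LD. The sketch below emits a JSON-LD snippet from Python; the event details are invented for the example:

```python
import json

# Key-value enrichment of a page fragment: a search engine that reads this
# markup can show the event's date and place directly in its results.
event = {
    "@context": "https://schema.org",
    "@type": "Event",
    "name": "Feria del Libro",
    "startDate": "2025-04-23",
    "location": {"@type": "Place", "name": "Madrid"},
}

# the <script> block would be embedded in the page's HTML
print('<script type="application/ld+json">\n'
      + json.dumps(event, ensure_ascii=False, indent=2)
      + "\n</script>")
```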

Relevance:

30.00%

Publisher:

Abstract:

Background: Post-discharge mortality is a frequent but poorly recognized contributor to child mortality in resource-limited countries. The identification of children at high risk of post-discharge mortality is a critically important first step in addressing this problem. Objectives: The objective of this project was to determine the variables most likely to be associated with post-discharge mortality that are to be included in a prediction modelling study. Methods: A two-round modified Delphi process was completed for the review of a priori selected variables and the selection of new variables. Variables were evaluated on relevance according to (1) prediction, (2) availability, (3) cost and (4) time required for measurement. Participants included experts in a variety of relevant fields. Results: During the first round of the modified Delphi process, 23 experts evaluated 17 variables. Forty further variables were suggested and reviewed during the second round by 12 experts; during this round, 16 additional variables were evaluated. Thirty unique variables were compiled for use in the prediction modelling study. Conclusion: A systematic approach was utilized to generate an optimal list of candidate predictor variables for incorporation into a study on the prediction of pediatric post-discharge mortality in a resource-poor setting.