63 results for Effects-Based Approach to Operations


Relevance:

100.00%

Publisher:

Abstract:

Transient neural assemblies mediated by synchrony in particular frequency ranges are thought to underlie cognition. We propose a new approach to their detection, using empirical mode decomposition (EMD), a data-driven method that removes the need for arbitrary bandpass filter cut-offs. Phase locking is sought between modes. We explore the features of EMD, including a quantitative assessment of its ability to preserve the phase content of signals, and proceed to develop a statistical framework with which to assess synchrony episodes. Furthermore, we propose a new approach to ensuring signal decomposition using EMD. We adapt the Hilbert spectrum to a time-frequency representation of phase locking and are able to locate synchrony successfully in time and frequency between synthetic signals reminiscent of EEG. We compare our approach, which we call EMD phase locking analysis (EMDPL), with existing methods and show it to offer improved time-frequency localisation of synchrony.
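A minimal sketch of the underlying idea, phase locking between empirical modes, is given below. It is not the authors' EMDPL implementation: it assumes the third-party PyEMD package (pip install EMD-signal) for the decomposition and uses the standard phase locking value as the synchrony measure.

```python
# Phase locking between matched EMD modes of two EEG-like signals.
import numpy as np
from scipy.signal import hilbert
from PyEMD import EMD

fs = 256                                    # sampling rate (Hz)
t = np.arange(0, 4, 1 / fs)
common = np.sin(2 * np.pi * 10 * t)         # shared 10 Hz component
x = common + 0.5 * np.random.randn(t.size)
y = common + 0.5 * np.random.randn(t.size)

emd = EMD()
imfs_x, imfs_y = emd(x), emd(y)             # data-driven decomposition, no bandpass cut-offs

def phase(imf):
    """Instantaneous phase of a mode via the Hilbert transform."""
    return np.angle(hilbert(imf))

# Phase locking value per mode pair: |mean(exp(i*dphi))|,
# 1 = perfect locking, 0 = no consistent phase relation.
for k in range(min(3, len(imfs_x), len(imfs_y))):
    dphi = phase(imfs_x[k]) - phase(imfs_y[k])
    plv = np.abs(np.mean(np.exp(1j * dphi)))
    print(f"IMF pair {k}: PLV = {plv:.2f}")
```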

Relevance:

100.00%

Publisher:

Abstract:

Background: 29 autoimmune diseases, including Rheumatoid Arthritis, gout, Crohn’s Disease, and Systemic Lupus Erythematosus, affect 7.6-9.4% of the population. While effective therapy is available, many patients do not follow treatment or use medications as directed. Digital health and Web 2.0 interventions have demonstrated much promise in increasing medication and treatment adherence, but to date many Internet tools have proven disappointing. In fact, most digital interventions continue to suffer from high attrition in patient populations, are burdensome for healthcare professionals, and have relatively short life spans. Objective: Digital health tools have traditionally centered on the transformation of existing interventions (such as diaries, trackers, stage-based or cognitive behavioral therapy programs, coupons, or symptom checklists) to electronic format. Advanced digital interventions have also incorporated attributes of Web 2.0 such as social networking, text messaging, and the use of video. Despite these efforts, there has been little measurable impact on non-adherence for illnesses that require medical interventions, and research must look to other strategies or development methodologies. As a first step in investigating the feasibility of developing such a tool, the objective of the current study was to systematically rate factors of non-adherence that have been reported in past research studies. Methods: Grounded Theory, recognized as a rigorous method that facilitates the emergence of new themes through systematic analysis, data collection and coding, was used to analyze quantitative, qualitative and mixed-method studies addressing the following autoimmune diseases: Rheumatoid Arthritis, gout, Crohn’s Disease, Systemic Lupus Erythematosus, and inflammatory bowel disease. Studies were only included if they contained primary data addressing the relationship with non-adherence. Results: Across the 27 included studies, four non-modifiable and 11 modifiable risk factors were identified. Over one third of the articles identified the following risk factors as common contributors to medication non-adherence (percent of studies reporting): patients not understanding treatment (44%), side effects (41%), age (37%), dose regimen (33%), and perceived medication ineffectiveness (33%). An unanticipated finding was the need for risk stratification tools (81%) with patient-centric approaches (67%). Conclusions: This study systematically identifies and categorizes medication non-adherence risk factors in select autoimmune diseases. Findings indicate that patients’ understanding of their disease and of the role of medication is paramount. An unexpected finding was that the majority of research articles called for the creation of tailored, patient-centric interventions that dispel personal misconceptions about disease, pharmacotherapy, and how the body responds to treatment. To our knowledge, such interventions do not yet exist in digital format. Rather than adopting a systems-level approach, digital health programs should focus on cohorts with heterogeneous needs and develop tailored interventions based on individual non-adherence patterns.

Relevance:

100.00%

Publisher:

Abstract:

Competency management is a very important part of a well-functioning organisation. Unfortunately, competency descriptions are not uniformly specified or defined across national, sectoral or organisational borders, leading to an opaque competency-description market with a multitude of competency frameworks and competency benchmarks. An ontology is a formalised description of a domain, which enables automated reasoning engines to be built that, by utilising the interrelations between entities, can make “intelligent” choices in different situations within the domain. By introducing formalised competency ontologies, automated tools such as skill gap analysis, training suggestion generation, and job search and recruitment can be developed that compare and contrast different competency descriptions at the semantic level. The major problem with defining a common formalised ontology for competencies is that there are so many viewpoints of competencies and competency frameworks. Work within the TRACE project has focused on finding common trends within different competency frameworks in order to allow an intermediate competency description to be made, which other frameworks can reference. This research has shown that competencies can be divided into “knowledge”, “skills” and what we call “others”. An ontology has been created based on this division, with a simple structure of different “kinds” of “knowledges” and “skills”, using semantic interrelations to define the basic semantic structure of the ontology. A prototype skill gap analysis tool has been developed. Personal profiles can be produced using the tool, and a skill gap analysis is performed against a desired competency profile by using an ontologically based inference engine, which is able to list closest fits and possible proficiency gaps.
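As an illustration of the kind of tool described, the sketch below performs a toy skill gap analysis over a hand-built competency hierarchy. The competency names, the broader-than relation, and the proficiency scale are all invented for demonstration; this is not the TRACE ontology.

```python
# Toy ontology-based skill gap analysis (illustrative competencies only).
from dataclasses import dataclass

# "Narrower-than" semantic relation: a specific competency implies its parents.
BROADER = {
    "python": "programming",
    "java": "programming",
    "programming": "it_skills",
    "sql": "it_skills",
}

def implied(competency: str) -> set[str]:
    """Expand one competency to itself plus everything it semantically implies."""
    out = {competency}
    while competency in BROADER:
        competency = BROADER[competency]
        out.add(competency)
    return out

@dataclass
class Profile:
    competencies: dict[str, int]  # competency -> proficiency level (1-5)

    def closure(self) -> dict[str, int]:
        expanded: dict[str, int] = {}
        for c, level in self.competencies.items():
            for e in implied(c):
                expanded[e] = max(expanded.get(e, 0), level)
        return expanded

def skill_gap(person: Profile, desired: Profile) -> dict[str, int]:
    """Competencies (after semantic expansion) where the person falls short."""
    have = person.closure()
    return {c: lvl - have.get(c, 0)
            for c, lvl in desired.closure().items()
            if have.get(c, 0) < lvl}

person = Profile({"java": 3})
job = Profile({"python": 2, "sql": 2})
print(skill_gap(person, job))   # {'python': 2, 'sql': 2}; 'programming' is covered via java
```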

Relevance:

100.00%

Publisher:

Abstract:

Previous anthropological investigations at Trentholme Drive in Roman York identified an unusual amount of cranial variation amongst the inhabitants, with some individuals suggested as having originated in the Middle East or North Africa. The current study investigates the validity of this assessment using modern anthropological methods to assess cranial variation in two groups: The Railway and Trentholme Drive. Strontium and oxygen isotope evidence derived from the dentition of 43 of these individuals was combined with the craniometric data to provide information on possible levels of migration and the range of homelands that may be represented. The results of the craniometric analysis indicated that the majority of the York population had European origins, but that 11% of the Trentholme Drive and 12% of The Railway study samples were likely of African descent. Oxygen analysis identified four incomers: three from areas warmer than the UK and one from a cooler or more continental climate. Although based on a relatively small sample of the overall population at York, this multidisciplinary approach made it possible to identify incomers, both men and women, from across the Empire. Evidence for possible second-generation migrants was also suggested. The results confirm the presence of a heterogeneous population resident in York and highlight the diversity, rather than the uniformity, of the population in Roman Britain. Am J Phys Anthropol 140:546-561, 2009. © 2009 Wiley-Liss, Inc.

Relevance:

100.00%

Publisher:

Abstract:

Within this paper, modern techniques such as satellite image analysis and tools provided by geographic information systems (GIS) are exploited in order to extend and improve existing techniques for mapping the spatial distribution of sediment transport processes. The processes of interest comprise mass movements such as solifluction, slope wash, dirty avalanches, and rock and boulder falls. They differ considerably in nature, and therefore different approaches are required to derive their spatial extent. A major challenge is addressing the difference between the comparably coarse resolution of the available satellite data (Landsat TM/ETM+, 30 m × 30 m) and the actual scale of sediment transport in this environment. A three-step approach has been developed, based on the concept of Geomorphic Process Units (GPUs): parameterization, process area delineation and combination. Parameters include land cover from satellite data and digital elevation model derivatives. Process areas are identified using a hierarchical classification scheme utilizing thresholds and definitions of topology. The approach was developed for the Karkevagge in Sweden and was successfully transferred to the Rabotsbekken catchment at Okstindan, Norway, using similar input data. Copyright © 2008 John Wiley & Sons, Ltd.
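The hierarchical, threshold-based classification step might look like the following sketch. The slope bounds, class names and input rasters are illustrative assumptions, not the paper's calibrated rules.

```python
# Rule-based delineation of process areas from DEM derivatives and land cover.
import numpy as np

slope = np.array([[2., 12., 38.], [5., 25., 45.], [1., 18., 50.]])  # degrees, from a DEM
veg = np.array([[1, 1, 0], [1, 0, 0], [1, 1, 0]])                   # 1 = vegetated (satellite land cover)

# Hierarchical rules applied in order of precedence: steeper processes win.
gpu = np.full(slope.shape, "none", dtype=object)
gpu[(slope >= 2) & (slope < 15) & (veg == 1)] = "solifluction"
gpu[(slope >= 15) & (slope < 35)] = "slope_wash"
gpu[slope >= 35] = "rockfall"                                       # above a rockfall threshold angle

print(gpu)   # one process unit label per grid cell
```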

Relevance:

100.00%

Publisher:

Abstract:

In this paper we propose that physically based equations should be combined with remote sensing techniques to enable a more theoretically rigorous estimation of the area-average soil heat flux, G. A standard physical equation (i.e. the analytical or exact method) for the estimation of G, in combination with a simple, but theoretically derived, equation for soil thermal inertia (Γ), provides the basis for a more transparent and readily interpretable method for the estimation of G, without the requirement for in situ instrumentation. Moreover, such an approach ensures a more universally applicable method than those derived from purely empirical studies (employing vegetation indices and albedo, for example). Hence, a new equation for the estimation of Γ (for homogeneous soils) is discussed in this paper which only requires knowledge of soil type, readily obtainable from extant soil databases and surveys, in combination with a coarse estimate of moisture status. This approach can be used to obtain area-averaged estimates of Γ (and thus G, as explained in paper II), which is important for large-scale energy balance studies that employ aircraft or satellite data. Furthermore, this method also relaxes the instrumental demands of studies at the plot and field scale (no requirement for in situ soil temperature sensors, soil heat flux plates and/or thermal conductivity sensors). In addition, this equation can be incorporated into soil-vegetation-atmosphere-transfer models that use the force-restore method to update surface temperatures (such as the well-known ISBA model), to replace the thermal inertia coefficient.
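A sketch of the two ingredients is given below, under the common definition of thermal inertia, Γ = √(λC), and the analytical (exact) half-order time integral for G. The soil property values and the discretisation are illustrative assumptions, not the paper's equations for heterogeneous moisture conditions.

```python
# Thermal inertia from soil properties, then G from a surface temperature series.
import numpy as np

lam = 1.0        # thermal conductivity, W m-1 K-1 (depends on soil type and moisture)
C = 2.0e6        # volumetric heat capacity, J m-3 K-1
gamma = np.sqrt(lam * C)          # thermal inertia, J m-2 K-1 s-1/2

# Half-hourly surface (skin) temperature over one day, e.g. from thermal remote sensing.
t = np.arange(0, 24 * 3600, 1800.0)                        # seconds
Ts = 290 + 8 * np.sin(2 * np.pi * (t - 8 * 3600) / 86400)  # diurnal wave, K

# Analytical method: G(t) = gamma / sqrt(pi) * integral_0^t (dTs/ds) / sqrt(t - s) ds,
# discretised with temperature increments between consecutive samples.
dTs = np.diff(Ts)
mid = 0.5 * (t[1:] + t[:-1])      # midpoint time of each increment
G = np.array([
    gamma / np.sqrt(np.pi) * np.sum(dTs[:i] / np.sqrt(t[i] - mid[:i]))
    for i in range(1, t.size)
])
print(G[:5])   # soil heat flux, W m-2, with no in situ plates required
```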

Relevance:

100.00%

Publisher:

Abstract:

Many currently available drugs show unfavourable physicochemical properties for delivery into or across the skin, and temporary chemical modulation of the penetrant is one option for achieving improved delivery properties. Pro-drugs are chemical derivatives of an active drug, which is covalently bonded to an inactive pro-moiety in order to overcome pharmaceutical and pharmacokinetic barriers. A pro-drug relies upon conversion within the body to release the parent active drug (and pro-moiety) to elicit its pharmacological effect. The main drawback of this approach is that the pro-moiety is essentially unwanted ballast which, when released, can lead to adverse effects. The term ‘co-drug’ refers to two or more therapeutic compounds active against the same disease bonded via a covalent chemical linkage, and it is this approach which is reviewed for the first time in the current article. For topically applied co-drugs, each moiety is liberated in situ, either chemically or enzymatically, once the stratum corneum barrier has been overcome by the co-drug. Advantages include synergistic modulation of the disease process, enhancement of drug delivery and pharmacokinetic properties, and the potential to enhance stability by masking labile functional groups. The amount of published work on co-drugs is limited, but the available data suggest the co-drug concept could provide a significant therapeutic improvement in dermatological diseases. However, the applicability of the co-drug approach is subject to strict limitations, pertaining mainly to the availability of compatible moieties and the physicochemical properties of the overall molecule.

Relevance:

100.00%

Publisher:

Abstract:

This article describes an empirical, user-centred approach to explanation design. It reports three studies that investigate what patients want to know when they have been prescribed medication. The question is asked in the context of the development of a drug prescription system called OPADE. The system is aimed primarily at improving the prescribing behaviour of physicians, but will also produce written explanations for indirect users such as patients. In the first study, a large number of people were presented with a scenario about a visit to the doctor and were asked to list the questions that they would like to ask the doctor about the prescription. On the basis of the results of this study, a categorization of question types was developed in terms of how frequently particular questions were asked. In the second and third studies, a number of different explanations were generated in accordance with this categorization, and a new sample of people were presented with another scenario and asked to rate the explanations on a number of dimensions. The results showed significant differences between the explanations: people preferred explanations that included items corresponding to the questions asked frequently in the first study. For an explanation to be considered useful, it had to include information about side effects, what the medication does, and any lifestyle changes involved. The implications of the results of the three studies are discussed in terms of the development of OPADE's explanation facility.

Relevance:

100.00%

Publisher:

Abstract:

The SCoTLASS problem, principal component analysis modified so that the components satisfy the Least Absolute Shrinkage and Selection Operator (LASSO) constraint, is reformulated as a dynamical system on the unit sphere. The LASSO inequality constraint is tackled by an exterior penalty function. A globally convergent algorithm is developed based on the projected gradient approach. The algorithm is illustrated numerically and discussed on a well-known data set. © 2004 Elsevier B.V. All rights reserved.
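A minimal sketch of the projected gradient idea for a single component follows. The penalty weight, step size and simple ascent loop are illustrative choices rather than the paper's algorithm settings.

```python
# Maximise v' S v on the unit sphere while an exterior penalty discourages
# violation of the LASSO constraint ||v||_1 <= t.
import numpy as np

rng = np.random.default_rng(0)
X = rng.standard_normal((100, 5))
S = np.cov(X, rowvar=False)          # sample covariance matrix

t, mu, step = 1.5, 10.0, 0.01        # LASSO bound, penalty weight, step size
v = rng.standard_normal(5)
v /= np.linalg.norm(v)               # start on the unit sphere

for _ in range(2000):
    viol = max(0.0, np.abs(v).sum() - t)           # exterior penalty: zero inside the feasible set
    grad = 2 * S @ v - mu * 2 * viol * np.sign(v)  # ascent direction of the penalised objective
    v = v + step * grad
    v /= np.linalg.norm(v)                         # projection back onto the unit sphere

print(np.round(v, 3), "explained variance:", round(float(v @ S @ v), 3))
```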

Relevance:

100.00%

Publisher:

Abstract:

Goal modelling is a well-known rigorous method for analysing problem rationale and developing requirements. Under the pressures typical of time-constrained projects, however, its benefits are not accessible: the effort and time needed to create the graph are substantial, and reading the results can be difficult owing to the effects of crosscutting concerns. Here we introduce an adaptation of KAOS to meet the needs of rapid turnaround and clarity. The main aim is to help the stakeholders gain insight into the larger issues that might be overlooked if they make a premature start on implementation. The method emphasises the use of obstacles, accepts under-refined goals, and has new methods for managing crosscutting concerns and strategic decision making. It is expected to be of value to agile as well as traditional processes.

Relevance:

100.00%

Publisher:

Abstract:

Students may have difficulty in understanding some of the complex concepts which they have been taught in the general areas of science and engineering. Whilst practical work, such as a laboratory-based examination of the performance of structures, has an important role in knowledge construction, it does have some limitations. Blended learning supports different learning styles and hence further benefits knowledge building. This research comprises empirical studies of how an innovative use of vodcasts (video podcasts) can enrich the learning experience in the structural properties of materials laboratory of an undergraduate course. Students were given the opportunity to download and view vodcasts on the theory before and after the experimental work. It was the students' choice when (before, after, or both before and after the practical) and how many times they viewed the vodcasts. In blended learning, the combination of face-to-face teaching, vodcasts, printed materials, practical experiments, writing reports and instructors' feedback benefits the different learning styles of the learners. In preparation for the practical laboratory work, the students were informed of the availability of the vodcasts prior to the practical session. After the practical work, students submitted an individual laboratory report for the assessment of the structures laboratory. The data collection consisted of a questionnaire completed by the students and the practical reports submitted by them for assessment. The results from the questionnaire were analysed quantitatively, whilst the data from the assessment reports were analysed qualitatively. The analysis shows that students who had not fully grasped the theory after the practical succeeded in gaining the required knowledge by viewing the vodcasts. Some students who had already understood the theory chose to view the vodcasts once or not at all; their understanding was demonstrated by the quality of the explanations in their reports. This is illustrated by the approach they took to explicate the results of their experimental work: for example, they could explain how to calculate the Young's modulus properly and provided the correct value for it. The research findings are valuable to instructors who design, develop and deliver different types of blended learning, and beneficial to learners who try different blended approaches. Recommendations are made on the role of the innovative application of vodcasts in knowledge construction for the structures laboratory, and to guide future work in this area of research.

Relevance:

100.00%

Publisher:

Abstract:

This paper exploits a structural time series approach to model the time pattern of multiple and resurgent food scares and their direct and cross-product impacts on consumer response. A structural time series Almost Ideal Demand System (STS-AIDS) is embedded in a vector error correction framework to allow for dynamic effects (VEC-STS-AIDS). Italian aggregate household data on meat demand are used to assess the time-varying impact of the resurgent BSE crisis (1996 and 2000) and the 1999 dioxin crisis. The VEC-STS-AIDS model monitors the short-run impacts and performs satisfactorily in terms of residual diagnostics, overcoming the major problems encountered by the customary vector error correction approach.
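The vector error correction backbone of such a model can be sketched with statsmodels, as below. The structural time series (time-varying) AIDS component is not reproduced here, and the demand data and crisis dummies are simulated stand-ins for the Italian series.

```python
# VECM with food-scare step dummies as exogenous regressors (illustrative data).
import numpy as np
import pandas as pd
from statsmodels.tsa.vector_ar.vecm import VECM

rng = np.random.default_rng(1)
idx = pd.date_range("1990-01", periods=156, freq="MS")       # monthly, 1990-2002
shares = pd.DataFrame(
    0.25 + 0.01 * rng.standard_normal((156, 3)).cumsum(axis=0),
    index=idx, columns=["beef", "pork", "poultry"],          # stand-in budget shares
)

# Step dummies switching on at each food scare.
exog = pd.DataFrame({
    "bse96": (idx >= "1996-03").astype(float),
    "dioxin99": (idx >= "1999-06").astype(float),
    "bse00": (idx >= "2000-10").astype(float),
}, index=idx)

model = VECM(shares, exog=exog, k_ar_diff=2, coint_rank=1, deterministic="co")
res = model.fit()
print(res.summary())   # short-run dynamics plus crisis-dummy effects
```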

Relevance:

100.00%

Publisher:

Abstract:

In clinical trials, situations often arise where more than one response from each patient is of interest, and it is required that any decision to stop the study be based upon some or all of these measures simultaneously. Theory for the design of sequential experiments with simultaneous bivariate responses is described by Jennison and Turnbull (Jennison, C., Turnbull, B. W. (1993). Group sequential tests for bivariate response: interim analyses of clinical trials with both efficacy and safety endpoints. Biometrics 49:741-752) and Cook and Farewell (Cook, R. J., Farewell, V. T. (1994). Guidelines for monitoring efficacy and toxicity responses in clinical trials. Biometrics 50:1146-1152) in the context of one efficacy and one safety response. These expositions are in terms of normally distributed data with known covariance. The methods proposed require specification of the correlation, ρ, between the test statistics monitored as part of the sequential test. It can be difficult to quantify ρ, and previous authors have suggested simply taking the lowest plausible value, as this will guarantee power. This paper begins with an illustration of the effect that inappropriate specification of ρ can have on the preservation of trial error rates: it is shown that both the type I error and the power can be adversely affected. As a possible solution to this problem, formulas are provided for the calculation of ρ from data collected as part of the trial. An adaptive approach that makes use of these formulas is proposed and evaluated, and an example is provided to illustrate the method. Attention is restricted to the bivariate case for ease of computation, although the formulas derived are applicable in the general multivariate case.
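The plug-in idea can be illustrated as follows. The paper's formulas are not reproduced; the sketch simply uses the interim sample correlation of simulated paired responses together with the bivariate normal distribution.

```python
# Estimate rho at an interim look and use it in a joint boundary-crossing probability.
import numpy as np
from scipy import stats

rng = np.random.default_rng(7)
cov = [[1.0, 0.4], [0.4, 1.0]]                             # true response correlation 0.4
data = rng.multivariate_normal([0.3, 0.2], cov, size=80)   # (efficacy, safety) per patient at interim

# For normal responses, the monitored test statistics inherit the response
# correlation, so the interim sample correlation is a natural plug-in for rho.
rho_hat = np.corrcoef(data, rowvar=False)[0, 1]

# Probability that both standardised statistics cross a boundary c under the
# estimated correlation: P(Z1 > c, Z2 > c) = 1 - 2*Phi(c) + Phi2(c, c; rho_hat).
c = 2.0
biv = stats.multivariate_normal(mean=[0, 0], cov=[[1, rho_hat], [rho_hat, 1]])
p_cross_both = 1 - 2 * stats.norm.cdf(c) + biv.cdf([c, c])
print(f"rho_hat = {rho_hat:.2f}, P(both cross {c}) = {p_cross_both:.4f}")
```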

Relevance:

100.00%

Publisher:

Abstract:

Genetic data obtained on population samples convey information about their evolutionary history. Inference methods can extract part of this information, but they require sophisticated statistical techniques that have been made available to the biologist community (through computer programs) only for simple and standard situations, typically involving a small number of samples. We propose here a computer program (DIY ABC) for inference based on approximate Bayesian computation (ABC), in which scenarios can be customized by the user to fit many complex situations involving any number of populations and samples. Such scenarios involve any combination of population divergences, admixtures and population size changes. DIY ABC can be used to compare competing scenarios, estimate parameters for one or more scenarios, and compute bias and precision measures for a given scenario and known values of parameters (the current version applies to unlinked microsatellite data). This article describes key methods used in the program and outlines its main features. The analysis of one simulated and one real dataset, both with complex evolutionary scenarios, illustrates the main possibilities of DIY ABC.
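The rejection-sampling principle behind ABC programs of this kind can be illustrated with a toy model, as in the sketch below. This is not DIY ABC's code, and it uses a Gaussian toy model rather than coalescent simulation of microsatellite data.

```python
# ABC rejection: simulate under the prior, keep parameters whose simulated
# summary statistic lands close to the observed one.
import numpy as np

rng = np.random.default_rng(3)
observed = rng.normal(loc=2.0, scale=1.0, size=200)   # "data" with unknown mean
s_obs = observed.mean()                                # summary statistic

def simulate(theta: float) -> float:
    """Simulate a dataset under parameter theta and return its summary statistic."""
    return rng.normal(loc=theta, scale=1.0, size=200).mean()

prior_draws = rng.uniform(-5, 5, size=50_000)          # draws from a flat prior
eps = 0.05                                             # tolerance on the summary distance
accepted = [th for th in prior_draws if abs(simulate(th) - s_obs) < eps]

print(f"accepted {len(accepted)} draws; "
      f"posterior mean ~ {np.mean(accepted):.2f} (true value 2.0)")
```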

Relevance:

100.00%

Publisher:

Abstract:

Objectives: To assess the potential source of variation that the surgeon may add to patient outcomes in a clinical trial of surgical procedures. Methods: Two large (n = 1380) parallel multicentre randomized surgical trials were undertaken to compare laparoscopically assisted hysterectomy with conventional methods of abdominal and vaginal hysterectomy, involving 43 surgeons. The primary end point of the trials was the occurrence of at least one major complication. Patients were nested within surgeons, giving the data set a hierarchical structure. A total of 10% of patients had at least one major complication, giving a sparse binary outcome variable. A linear mixed logistic regression model (with logit link function) was used to model the probability of a major complication, with surgeon fitted as a random effect. Models were fitted using the method of maximum likelihood in SAS. Results: There were many convergence problems. These were resolved using a variety of approaches, including: treating all effects as fixed for the initial model building; modelling the variance of a parameter on a logarithmic scale; and centring continuous covariates. The initial model building indicated no significant 'type of operation' by surgeon interaction effect in either trial; the 'type of operation' term was highly significant in the abdominal trial, and the 'surgeon' term was not significant in either trial. Conclusions: The analysis did not find a surgeon effect, but it is difficult to conclude that there was no difference between surgeons. The statistical test may have lacked sufficient power, and the variance estimates were small with large standard errors, indicating that the precision of the variance estimates may be questionable.
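A sketch of the model structure on simulated data follows. The original analysis used maximum likelihood in SAS; this stand-in uses statsmodels' variational Bayes mixed GLM as an accessible analogue, and all data-generating values are invented.

```python
# Logistic mixed model with a random intercept per surgeon (simulated data).
import numpy as np
import pandas as pd
from statsmodels.genmod.bayes_mixed_glm import BinomialBayesMixedGLM

rng = np.random.default_rng(11)
n, n_surgeons = 1380, 43
df = pd.DataFrame({
    "surgeon": rng.integers(0, n_surgeons, size=n),
    "op_type": rng.integers(0, 2, size=n),                  # 0 = conventional, 1 = laparoscopic
})
surgeon_effect = 0.3 * rng.standard_normal(n_surgeons)      # small between-surgeon variance
logit = -2.2 + 0.5 * df["op_type"] + surgeon_effect[df["surgeon"]]
df["complication"] = rng.binomial(1, 1 / (1 + np.exp(-logit)))  # ~10% event rate

model = BinomialBayesMixedGLM.from_formula(
    "complication ~ op_type",                 # fixed effect: type of operation
    {"surgeon": "0 + C(surgeon)"},            # random intercept per surgeon
    df,
)
result = model.fit_vb()                       # variational Bayes fit
print(result.summary())                       # fixed effects plus surgeon variance parameter
```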