914 results for scenario-based assessment
Abstract:
Intelligence from a human source that is falsely believed to be true is potentially more harmful than no intelligence at all. Veracity assessment of gathered intelligence is therefore one of the most important phases of the intelligence process. Lie detection and veracity assessment methods have been studied widely, but a comprehensive analysis of their applicability is lacking. There are several problems related to the efficacy of lie detection and veracity assessment. According to a conventional belief, there exists an all-powerful lie detection method that is almost 100% accurate and suitable for any social encounter. Scientific studies have shown that this is not the case, and popular approaches are often oversimplified. The main research question of this study was: What is the applicability of reliable, scientifically grounded veracity assessment methods in terms of the following criteria?
- Accuracy, i.e. the probability of detecting deception successfully
- Ease of Use, i.e. how easily the method can be applied correctly
- Time Required to apply the method reliably
- No Need for Special Equipment
- Unobtrusiveness of the method
To answer the main research question, the following supporting research questions were answered first: What kinds of interviewing and interrogation techniques exist, and how could they be used in the intelligence interview context? What kinds of reliable, scientifically grounded lie detection and veracity assessment methods exist? What kinds of uncertainty and other limitations do these methods involve? Two major databases, Google Scholar and Science Direct, were used to search and collect existing studies and other papers related to the topic. After the search phase, an understanding of the existing lie detection and veracity assessment methods was established through a meta-analysis. A Multi-Criteria Analysis utilizing the Analytic Hierarchy Process was conducted to compare scientifically valid lie detection and veracity assessment methods in terms of the assessment criteria. In addition, a field study was arranged to gain first-hand experience of the applicability of the different lie detection and veracity assessment methods. The Studied Features of Discourse and the Studied Features of Nonverbal Communication gained the highest ranking in overall applicability. They were assessed to be the easiest and fastest to apply, and to have the required temporal and contextual sensitivity. The Plausibility and Inner Logic of the Statement, the Method for Assessing the Credibility of Evidence and the Criteria-Based Content Analysis were also found to be useful, but with some limitations. Discourse Analysis and the Polygraph were assessed to be the least applicable. Results from the field study support these findings. However, it was also discovered that even the most applicable methods are not entirely trouble-free. In addition, this study highlighted that three channels of information, Content, Discourse and Nonverbal Communication, can be subjected to veracity assessment methods that are scientifically defensible. There is at least one reliable and applicable veracity assessment method for each of the three channels. All of the methods require disciplined application and a scientific working approach. There are no quick gains if high accuracy and reliability are desired.
Since most current lie detection studies are built around a scenario in which roughly half of the assessed people are entirely truthful and the other half are liars presenting a well-prepared cover story, it is proposed that future studies test lie detection and veracity assessment methods against partially truthful human sources. Such a test setup would highlight new challenges and opportunities for the use of existing, widely studied lie detection methods, as well as for modern ones still under development.
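As a rough illustration of the Multi-Criteria Analysis step described in this abstract, the following Python sketch derives criterion weights with the Analytic Hierarchy Process from a pairwise comparison matrix. The criterion names follow the abstract, but the judgment values and the consistency check are illustrative assumptions, not figures from the thesis.

```python
import numpy as np

# Illustrative AHP sketch: derive criterion weights from a pairwise
# comparison matrix via the principal eigenvector (judgment values are
# made up, not taken from the thesis).
criteria = ["Accuracy", "Ease of Use", "Time Required",
            "No Special Equipment", "Unobtrusiveness"]

# A[i, j] = how much more important criterion i is than criterion j
# (Saaty's 1-9 scale); the matrix is reciprocal: A[j, i] = 1 / A[i, j].
A = np.array([
    [1,   3,   5,   4,   3],
    [1/3, 1,   3,   2,   1],
    [1/5, 1/3, 1,   1,   1/2],
    [1/4, 1/2, 1,   1,   1/2],
    [1/3, 1,   2,   2,   1],
])

eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)                 # principal eigenvalue
weights = np.abs(eigvecs[:, k].real)
weights /= weights.sum()                    # normalise to priorities

# Consistency ratio (CR < 0.1 is the usual acceptance threshold)
n = A.shape[0]
ci = (eigvals.real[k] - n) / (n - 1)
ri = {3: 0.58, 4: 0.90, 5: 1.12}[n]         # Saaty's random index
print(dict(zip(criteria, weights.round(3))), "CR =", round(ci / ri, 3))
```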
Abstract:
The purpose of this study is to explore how scenarios can be exploited in strategic assessment of the external business environment. One of the key challenges for managers worldwide is to adapt their businesses to an ever-changing business environment. As companies' external business environment constantly presents new opportunities and threats, it is extremely important that companies continuously monitor the possible changes happening around them. As the speed of change rises, assessing the future has become more and more vital. The study was conducted as exploratory research, and the research strategy was influenced by scenario planning and case study strategy. The study examined the European pet food sector from a future-oriented point of view. A qualitative approach was chosen, and empirical data were collected primarily through seven expert interviews. Secondary data about the sector were used as complementary empirical material. In the theoretical part of the research it was found that traditional analysis frameworks are nowadays ill-suited for strategic assessment of the external business environment. For this reason, a purpose-built combination framework was employed both as the study's theoretical framework and as its analysis technique; the framework also formed the basis for the interview questions. Both the theoretical and the empirical parts of the study showed that, in strategic assessment of the external business environment, it is important to concentrate on the future in addition to the current situation. Traditional analysis frameworks offer a good starting point for collecting relevant data, but they do not encourage a deeper analysis. By adding characteristics from scenario planning to these more traditional tools, a new analysis framework was created, which ensured a more comprehensive analysis. By understanding the interconnections between discovered phenomena and changes, and by recognizing uncertainties, the user is helped to reflect on the environment more profoundly. The contributions of the study are both theoretical and managerial. The new analysis framework strives to answer the current needs of strategic assessment of the external business environment, and it was tested in the context of the European pet food sector. As for managerial contributions, the importance lies in understanding the future: managers must take the future into account and understand that it includes various possibilities, all of which must be reflected upon.
Abstract:
A noted benefit of Project Based Learning (PBL) as a teaching strategy is how it engages the student and enhances learning outcomes through working on challenges intended to depict dilemmas outside the classroom. PBL has seldom been applied outside the parameters of the classroom curriculum. The needs assessment carried out in this research project examined current practices of language instruction and of International Administrative Professionals in both the private and public sectors of the Language Industry. Participants responded to survey questions about their current administrative practices, strategies, and program characteristics. The study investigated the usefulness of a handbook on assisting administrative service teams in language instruction settings toward an engaged, PBL-based approach to student service issues. The diverse opinions, beliefs, and ideas, along with institutional policy, can provide a beneficial framework for future tools.
Abstract:
Parkinson's disease (PD) was long considered solely for its damage to the motor circuits of the brain. It is now regarded as a multisystem disorder with multiple non-motor aspects, including damage to cognitive circuits. The presence of mild cognitive impairment (MCI) in PD has been linked to structural changes in grey matter and white matter, as well as to functional changes in the brain. In particular, significantly reduced activity has been observed with fMRI in the 'cognitive' corticostriatal loop in PD-MCI patients compared with PD non-MCI patients. Little is known about the course of these functional patterns over time. In this study, we present a longitudinal follow-up of 24 non-demented PD patients who underwent a neuropsychological investigation and were separated into two groups, with and without MCI (MCI n = 11, non-MCI n = 13), according to Level 2 of the Movement Disorders Society recommendations for the diagnosis of PD-MCI. Each participant then underwent fMRI while performing the Wisconsin task during two sessions 19 months apart. Our longitudinal results show that, during the planning period of the task, non-MCI PD patients engaged the normal cortical resources but additionally activated cortical areas related to decision-making, such as the medial prefrontal cortex (PFC), the parietal lobe and the superior PFC, whereas PD-MCI patients failed to engage these areas at time 2. The striatum was not engaged in either group at time 1, nor in the MCI group at time 2. Furthermore, the medial temporal lobe structures were under-recruited over time for both MCI and non-MCI patients, and their activity was positively correlated with MoCA scores. The parietal cortex, anterior PFC, superior PFC and posterior putamen were negatively correlated with MoCA scores over time. These results reveal a functional alteration of the basal ganglia-thalamo-cortical axis early in PD, as well as different levels of cortical involvement during cognitive impairment. This difference in cortical recruitment of resources could reflect, longitudinally, the distinct deficient circuits of mild cognitive impairment in PD.
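A minimal Python sketch of the kind of correlation analysis mentioned above (regional activation change versus MoCA score). The array values and the choice of the medial temporal lobe as the example region are placeholders, not data from the study.

```python
import numpy as np
from scipy.stats import pearsonr

# Hypothetical sketch: relate per-patient change in regional fMRI
# activation (time 2 minus time 1) to MoCA scores. The arrays below
# are placeholders, not study data.
moca = np.array([27, 24, 22, 26, 21, 25, 23, 28, 20, 26, 24, 22])
mtl_activation_change = np.array([0.12, -0.05, -0.20, 0.08, -0.31, 0.04,
                                  -0.11, 0.15, -0.40, 0.06, -0.02, -0.18])

r, p = pearsonr(mtl_activation_change, moca)
print(f"medial temporal lobe vs. MoCA: r = {r:.2f}, p = {p:.3f}")
```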
Abstract:
The Cochin estuarine system is among the most productive aquatic environments along the southwest coast of India; it exhibits unique ecological features and has great socioeconomic relevance. Serious investigations carried out during the past decades on hydro-biogeochemical variables have pointed out variations in the health and ecological functioning of this ecosystem. Characterisation of organic matter in the estuary has been attempted in many investigations, but detailed studies covering the degradation state of organic matter using a molecular-level approach have not. The thesis entitled Provenance, Isolation and Characterisation of Organic Matter in the Cochin Estuarine Sediment: "A Diagenetic Amino Acid Marker Scenario" is an integrated approach to evaluate the source, quantity, quality, and degradation state of the organic matter in the surface sediments of the Cochin estuarine system through the combined application of bulk and molecular-level tools. Sediment and water samples from nine stations in the Cochin estuary were collected in five seasonal sampling campaigns for biogeochemical assessment and for determining the distribution pattern of sedimentary organic matter. The sampling seasons were abbreviated as follows: April 2009 (pre-monsoon: PRM09), August 2009 (monsoon: MON09), January 2010 (post-monsoon: POM09), April 2010 (pre-monsoon: PRM10) and September 2012 (monsoon: MON12). In order to evaluate the general environmental conditions of the estuary, water samples were analysed for water quality parameters, chlorophyll pigments and nutrients by standard methods. The investigations suggested that the hydrographical variables and nutrients in the Cochin estuary support diverse species of flora and fauna. Moreover, sedimentary variables such as pH, Eh, texture, TOC, and fractions of nitrogen and phosphorus were determined to assess the general geochemical setting as well as the redox status. The periodically fluctuating oxic/anoxic conditions and the texture are the most significant variables controlling the other variables of the aquatic environment. The organic matter in the estuary comprises a complex mixture of autochthonous and allochthonous materials. Autochthonous input is limited or enhanced by nutrient elements such as N and P (in their various fractions), which were used as a tool to evaluate their bioavailability. A bulk-parameter approach, including biochemical composition, stoichiometric elemental ratios and stable carbon isotope ratios, was also employed to assess the quality and quantity of sedimentary organic matter in the study area. Molecular-level characterisation of free sugars and amino acids was carried out by liquid chromatographic techniques. Carbohydrates are products of primary production, and their occurrence in sediments as free sugars can provide information on estuarine productivity. Amino acid biogeochemistry provided implications for system productivity, the nature of the organic matter and the degradation status of the sedimentary organic matter in the study area. The predominance of carbohydrates over protein indicated faster mineralisation of proteinaceous organic matter in the sediments and suggested that the estuary behaves as a detrital trap for the accumulation of aged organic matter. The higher lipid content and LPD/CHO ratio pointed towards a better food quality that supports benthic fauna and towards better accumulation of lipid compounds in the sedimentary environment.
Allochthonous addition of carbohydrates via terrestrial run-off was responsible for the lower PRT/CHO ratio estimated in the sediments; the lower ratios also denote a detrital heterotrophic environment. Biopolymeric carbon and the algal contribution to BPC provided important information for a better understanding of the trophic state of the estuarine system, and the higher values of the chlorophyll-a to phaeophytin ratio indicated rapid deposition of phytoplankton to the sediment. The estimated TOC/TN ratios implied a combined input of both terrestrial and autochthonous organic matter to the sediments. Among the free sugars, depleted levels of glucose were observed in the sediments at most stations, and an abundance of mannose was observed at station S5 during the present investigation. Among the aldohexoses, the concentration of galactose was found to be higher at most stations. The relative abundance of amino acids (AAs) in the estuarine sediments by season followed the trend: PRM09 - Leucine > Phenylalanine > Arginine > Lysine; MON09 - Lysine > Aspartic acid > Histidine > Tyrosine > Phenylalanine; POM09 - Lysine > Histidine > Phenylalanine > Leucine > Methionine > Serine > Proline > Aspartic acid; PRM10 - Valine > Aspartic acid > Histidine > Phenylalanine > Serine > Proline; MON12 - Lysine > Phenylalanine > Aspartic acid > Histidine > Valine > Tyrosine > Methionine. For simplicity and to allow generalised interpretations, the study area was classified into three zones based on salinity. The distribution of AAs in the three zones followed the trend: Freshwater zone (S1, S2): Phenylalanine > Lysine > Aspartic acid > Methionine > Valine ≈ Leucine > Proline > Histidine > Glycine > Serine > Glutamic acid > Tyrosine > Arginine > Alanine > Threonine > Cysteine > Isoleucine. Estuarine zone (S3, S4, S5, S6): Lysine > Aspartic acid > Phenylalanine > Leucine > Valine > Histidine > Methionine > Tyrosine > Serine > Glutamic acid > Proline > Glycine > Arginine > Alanine > Isoleucine > Cysteine > Threonine. Riverine/industrial zone (S7, S8, S9): Phenylalanine > Lysine > Aspartic acid > Histidine > Serine > Arginine > Tyrosine > Leucine > Methionine > Glutamic acid > Alanine > Glycine > Cysteine > Proline > Isoleucine > Threonine > Valine. The abundance of AAs such as glutamic acid, aspartic acid, isoleucine, valine, tyrosine, and phenylalanine in the sediments of the study area indicated freshly derived organic matter.
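A small Python sketch of how the bulk diagnostic ratios cited above (PRT/CHO, LPD/CHO, TOC/TN, biopolymeric carbon) could be computed. The input values are invented, and the carbon conversion factors are the commonly used literature values, which may differ from those applied in the thesis.

```python
# Illustrative calculation of bulk diagnostic ratios for sedimentary
# organic matter. Numbers are invented placeholders, not measurements
# from the thesis.
def diagnostic_ratios(protein_mg_g, carbohydrate_mg_g, lipid_mg_g,
                      toc_percent, tn_percent):
    # Biopolymeric carbon (BPC) from commonly used conversion factors
    # (0.40, 0.49 and 0.75 mg C per mg of carbohydrate, protein and
    # lipid, respectively).
    bpc = 0.40 * carbohydrate_mg_g + 0.49 * protein_mg_g + 0.75 * lipid_mg_g
    return {
        "PRT/CHO": protein_mg_g / carbohydrate_mg_g,
        "LPD/CHO": lipid_mg_g / carbohydrate_mg_g,
        "TOC/TN": toc_percent / tn_percent,
        "BPC (mg C/g)": bpc,
    }

print(diagnostic_ratios(protein_mg_g=1.8, carbohydrate_mg_g=4.2,
                        lipid_mg_g=2.5, toc_percent=2.1, tn_percent=0.18))
```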
Abstract:
This is the scenario used in the course when individuals are unable to participate in group work. It includes a written case study of a dairy farm and several news articles describing various views on the business of dairy farming.
Abstract:
Abstract based on that of the publication.
Abstract:
This thesis focuses on the monitoring, fault detection and diagnosis of Wastewater Treatment Plants (WWTP), which are important fields of research for a wide range of engineering disciplines. The main objective is to evaluate and apply a novel artificial intelligence methodology based on situation assessment for monitoring and diagnosis of Sequencing Batch Reactor (SBR) operation. To this end, Multivariate Statistical Process Control (MSPC) in combination with Case-Based Reasoning (CBR) was developed; the methodology was evaluated on three different SBR plants (pilot- and lab-scale) and validated on the BSM1 plant layout.
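A minimal Python sketch of the MSPC ingredient of such a methodology: a PCA model is fitted to in-control SBR cycles, and new cycles are flagged when Hotelling's T² or the squared prediction error (SPE) exceeds empirical limits. The data, the number of retained components and the 99th-percentile limits are illustrative assumptions; the CBR part (retrieving and adapting similar past fault cases) is not shown.

```python
import numpy as np

# Minimal sketch of PCA-based multivariate statistical process control:
# fit a PCA model on in-control cycles, then flag new cycles whose
# Hotelling's T2 or SPE exceed empirical limits. Data are synthetic.
rng = np.random.default_rng(0)
X_normal = rng.normal(size=(200, 12))          # 200 in-control cycles, 12 variables

mean, std = X_normal.mean(0), X_normal.std(0)
Xc = (X_normal - mean) / std                   # autoscale

U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
k = 3                                          # retained principal components
P = Vt[:k].T                                   # loadings
lam = (s[:k] ** 2) / (Xc.shape[0] - 1)         # PC variances

def monitor(x_new):
    xc = (x_new - mean) / std
    t = xc @ P                                 # scores
    t2 = np.sum(t ** 2 / lam)                  # Hotelling's T2
    residual = xc - t @ P.T
    spe = residual @ residual                  # SPE / Q statistic
    return t2, spe

# Empirical 99th-percentile control limits from the training cycles
stats = np.array([monitor(x) for x in X_normal])
t2_lim, spe_lim = np.percentile(stats, 99, axis=0)

t2, spe = monitor(rng.normal(size=12) + 1.5)   # a simulated faulty cycle
print("fault suspected:", t2 > t2_lim or spe > spe_lim)
```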
Abstract:
The research presented in this thesis focuses on the application and improvement of existing analytical methodologies, and on the development of new procedures, for studying the environmental effects of metal dispersion around abandoned mining areas. First, different single and sequential extraction procedures were applied to study the mobility, hazardousness and bioavailability of the metals contained in mining wastes of differing characteristics. In addition, to study the potential sources of Pb in the vegetation of the mining areas under study, a methodology based on Pb isotope ratios determined by ICP-MS was evaluated. Finally, given the large number of samples analysed to assess the impact of the mining activities, the development of high-throughput analytical methods was considered appropriate. In this regard, the implementation of quantitative strategies, as well as the application of instrumental improvements in XRF equipment, was evaluated in order to obtain reliable analytical results in plant analysis. Furthermore, quality parameters such as precision, accuracy and detection limits were carefully determined for the various XRF spectrometer configurations used in the course of this work (EDXRF, WDXRF and EDPXRF), in order to establish the capability of XRF as an alternative to the classical techniques commonly applied for the determination of elements in plant samples.
Abstract:
The basic idea of vibration-based damage detection in Structural Health Monitoring (SHM) is that damage alters the stiffness, mass or energy-dissipation properties of a system, which in turn alters its dynamic response. Within the context of pattern recognition, this thesis presents a hybrid reasoning methodology for assessing damage in structures, combining the use of a model of the structure and/or previous experiments with a knowledge-based reasoning scheme to evaluate whether damage is present, as well as its severity and location. The methodology involves elements related to vibration analysis, mathematics (wavelets, statistical process control), signal and/or pattern analysis and processing (case-based reasoning, self-organising maps), smart structures and damage detection. The techniques are validated numerically and experimentally considering corrosion, mass loss, mass accumulation and impacts. The structures used in this work are a cantilevered truss-type structure, an aluminium beam, two pipe sections and part of the wing of a commercial aircraft.
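A minimal Python sketch of one ingredient of the hybrid approach described above: statistical process control applied to a simple vibration feature (the dominant spectral frequency), where a drop caused by stiffness loss pushes the feature outside baseline control limits. The signals, sampling rate and thresholds are synthetic placeholders, and the wavelet, self-organising map and case-based reasoning components are not shown.

```python
import numpy as np

# Extract a spectral feature from vibration records of the healthy
# structure, set control-chart limits, and flag records from a possibly
# damaged state. All signals are synthetic placeholders.
rng = np.random.default_rng(1)
fs = 1000                                         # sampling rate (Hz)

def dominant_frequency(signal):
    spectrum = np.abs(np.fft.rfft(signal))
    freqs = np.fft.rfftfreq(len(signal), d=1 / fs)
    return freqs[np.argmax(spectrum[1:]) + 1]     # skip the DC bin

def record(natural_freq, duration=4.0):
    t = np.arange(0, duration, 1 / fs)
    return np.sin(2 * np.pi * natural_freq * t) + 0.1 * rng.normal(size=t.size)

# Baseline records with small environmental variability around 52 Hz
baseline = np.array([dominant_frequency(record(52.0 + rng.normal(scale=0.2)))
                     for _ in range(30)])
center, sigma = baseline.mean(), baseline.std(ddof=1)

# Shewhart-style 3-sigma limits; a stiffness loss lowers the natural frequency
new = dominant_frequency(record(49.5))
print("outside control limits:", abs(new - center) > 3 * sigma)
```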
Abstract:
Changes in mature forest cover amount, composition, and configuration can be of significant consequence to wildlife populations. The response of wildlife to forest patterns is of concern to forest managers because it lies at the heart of such competing approaches to forest planning as aggregated vs. dispersed harvest block layouts. In this study, we developed a species assessment framework to evaluate the outcomes of forest management scenarios on biodiversity conservation objectives. Scenarios were assessed in the context of a broad range of forest structures and patterns that would be expected to occur under natural disturbance and succession processes. Spatial habitat models were used to predict the effects of varying degrees of mature forest cover amount, composition, and configuration on habitat occupancy for a set of 13 focal songbird species. We used a spatially explicit harvest scheduling program to model forest management options and simulate future forest conditions resulting from alternative forest management scenarios, and a process-based fire-simulation model to simulate future forest conditions resulting from natural wildfire disturbance. Spatial pattern signatures were derived for both habitat occupancy and forest conditions, and these were placed in the context of the simulated range of natural variation. Strategic policy analyses were set in the context of current Ontario forest management policies, including the use of sequential time-restricted harvest blocks (created for woodland caribou (Rangifer tarandus) conservation) and delayed harvest areas (created for American marten (Martes americana atrata) conservation). This approach increased the realism of the analysis but reduced the generality of interpretations. We found that forest management options that create linear strips of old forest deviated the most from simulated natural patterns and had the greatest negative effects on habitat occupancy, whereas policy options that specify deferment and timing of harvest for large blocks helped ensure the stable presence of an intact mature forest matrix over time. The management scenario that focused on maintaining compositional targets best supported biodiversity objectives by providing the composition patterns required by the 13 focal species, but this scenario may be improved by adding some broad-scale spatial objectives to better maintain large blocks of interior forest habitat through time.
Abstract:
Despite the many models developed for phosphorus concentration prediction at differing spatial and temporal scales, there has been little effort to quantify uncertainty in their predictions. Quantification of model prediction uncertainty is desirable for informed decision-making in river-systems management. An uncertainty analysis of the process-based model, the integrated catchment model of phosphorus (INCA-P), within the generalised likelihood uncertainty estimation (GLUE) framework is presented. The framework is applied to the Lugg catchment (1,077 km²), a River Wye tributary on the England–Wales border. Daily discharge and monthly phosphorus (total reactive and total), for a limited number of reaches, are used to initially assess the uncertainty and sensitivity of 44 model parameters identified as being most important for discharge and phosphorus predictions. This study demonstrates that parameter homogeneity assumptions (spatial heterogeneity is treated as land use type fractional areas) can achieve higher model fits than a previously expertly calibrated parameter set. The model is capable of reproducing the hydrology, but a threshold Nash-Sutcliffe coefficient of determination (E or R²) of 0.3 is not achieved when simulating observed total phosphorus (TP) data in the upland reaches or total reactive phosphorus (TRP) in any reach. Despite this, the model reproduces the general dynamics of TP and TRP in point-source-dominated lower reaches. This paper discusses why this application of INCA-P fails to find any parameter sets that simultaneously describe all observed data acceptably. The discussion focuses on the uncertainty of readily available input data, and on whether such process-based models should be used when there are insufficient data to support their many parameters.
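A minimal GLUE-style sketch in Python, assuming a toy model in place of INCA-P: candidate parameter sets are sampled, scored with the Nash-Sutcliffe efficiency against observations, and retained as behavioural when they exceed the 0.3 threshold mentioned above.

```python
import numpy as np

# Minimal GLUE-style sketch: score candidate parameter sets with the
# Nash-Sutcliffe efficiency (NSE) and keep the "behavioural" ones above
# a 0.3 threshold. The toy model and observations stand in for INCA-P,
# which is far more complex.
rng = np.random.default_rng(42)
obs = 2.0 + np.sin(np.linspace(0, 6, 120)) + 0.2 * rng.normal(size=120)

def nse(simulated, observed):
    return 1 - np.sum((observed - simulated) ** 2) / np.sum(
        (observed - observed.mean()) ** 2)

def toy_model(amplitude, offset):
    return offset + amplitude * np.sin(np.linspace(0, 6, 120))

# Monte Carlo sampling of the two toy parameters from uniform priors
samples = rng.uniform(low=[0.2, 1.0], high=[2.0, 3.0], size=(5000, 2))
scores = np.array([nse(toy_model(a, c), obs) for a, c in samples])

behavioural = samples[scores > 0.3]
print(f"{len(behavioural)} of {len(samples)} parameter sets are behavioural")
```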
Abstract:
Shallow groundwater beneath a former airfield site in southern England has been heavily contaminated with a wide range of chlorinated solvents. The feasibility of using bacterial biosensors to complement chemical analysis and enable cost-effective, focused sampling was assessed as part of a site evaluation programme. Five different biosensors, three metabolic (Vibrio fischeri, Pseudomonas fluorescens 10568 and Escherichia coli HB101) and two catabolic (Pseudomonas putida TVA8 and E. coli DH5alpha), were employed to identify areas where the availability and toxicity of pollutants are of most immediate environmental concern. The biosensors showed different sensitivities to each other and to the groundwater samples tested, and there was generally good agreement with chemical analyses. The potential efficacy of remediation strategies was explored by coupling sample manipulation to biosensor tests. Manipulation involved sparging and charcoal treatment procedures to simulate remedial engineering solutions. Sparging was sufficient at most locations.
Abstract:
Calculations of the absorption of solar radiation by atmospheric gases, and by water vapor in particular, depend on the quality of databases of spectral line parameters. There has been increasing scrutiny of databases such as HITRAN in recent years, but this has mostly been performed on a band-by-band basis. We report nine high-spectral-resolution (0.03 cm⁻¹) measurements of the solar radiation reaching the surface in southern England over the wavenumber range 2000 to 12,500 cm⁻¹ (0.8 to 5 µm) that allow a unique assessment of the consistency of the spectral line databases over this entire spectral region. The data are assessed in terms of the modeled water vapor column required to bring calculations and observations into agreement; for an entirely consistent database, this water vapor column should be constant with frequency. For the HITRAN01 database, the spread in water vapor column is about 11%, with distinct shifts between different spectral regions. The HITRAN04 database is in significantly better agreement (about 5% spread) in the completely updated 3000 to 8000 cm⁻¹ spectral region, but inconsistencies between individual spectral regions remain: for example, in the 8000 to 9500 cm⁻¹ spectral region, the results indicate an 18% (±1%) underestimate in line intensities with respect to the 3000 to 8000 cm⁻¹ region. These measurements also indicate the impact of isotopic fractionation of water vapor in the 2500 to 2900 cm⁻¹ range, where HDO lines dominate over the lines of the most abundant isotope of H2O.
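A toy Python summary of the consistency metric described above: given the water vapour column retrieved independently in several spectral bands, report the spread relative to the mean, together with the wavenumber-to-wavelength conversion. The band edges and column values are invented placeholders, not the measurements from the paper.

```python
import numpy as np

# Toy consistency check: spread of retrieved water vapour columns across
# spectral bands, relative to the mean. All values are placeholders.
bands_cm1 = [(2000, 3000), (3000, 8000), (8000, 9500), (9500, 12500)]
retrieved_column_kg_m2 = np.array([21.5, 22.0, 25.9, 22.8])

spread = np.ptp(retrieved_column_kg_m2) / retrieved_column_kg_m2.mean()
print(f"column spread across bands: {100 * spread:.1f}%")

# Wavenumber (cm^-1) to wavelength (micrometres): lambda_um = 1e4 / nu
for lo, hi in bands_cm1:
    print(f"{lo}-{hi} cm^-1  ->  {1e4 / hi:.2f}-{1e4 / lo:.2f} um")
```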