965 results for operational parameters


Relevance:

20.00%

Publisher:

Abstract:

This Handbook is designed to outline the purposes, goals, structure, and operational procedures of Iowa's Child Welfare Decategorization Program. It incorporates the experience gained since the inception of Decategorization in 1987. As with any initiative that began on a pilot basis, Decategorization has been an evolving program in which parameters and procedures have been modified to achieve the desired results. The Handbook serves both as a guidebook for the implementation and operation of Decategorization and as a means of communicating information on program parameters and procedures. The purposes of Decategorization are as follows. Decategorization of child welfare and juvenile justice funding is an initiative intended to establish systems of delivering human services based upon client needs, replacing systems based upon a multitude of categorical funding programs and funding sources, each with different service definitions and eligibility requirements. Decategorization is designed to redirect child welfare and juvenile justice funding to services that are more preventive, family-centered, and community-based, in order to reduce the use of restrictive approaches that rely on institutional, out-of-home, and out-of-community care.

Relevance:

20.00%

Publisher:

Abstract:

The statistical theory of signal detection and of the estimation of signal parameters is reviewed and applied to the detection of the gravitational-wave signal from a coalescing binary by a laser interferometer. The correlation integral and the covariance matrix are investigated numerically for all possible static configurations. Approximate analytic formulas are derived for the case of a narrow-band sensitivity configuration of the detector.
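The abstract does not restate the detection statistic it relies on; for orientation, the standard matched-filter signal-to-noise ratio used in this context can be sketched (a standard form, assuming the usual noise-weighted inner product, not a formula quoted from the paper) as

    (a \mid b) = 4\,\mathrm{Re}\int_{0}^{\infty} \frac{\tilde{a}(f)\,\tilde{b}^{*}(f)}{S_{n}(f)}\,\mathrm{d}f,
    \qquad
    \rho = \frac{(x \mid h)}{\sqrt{(h \mid h)}},

where x is the detector output, h the template waveform for the coalescing binary, and S_n(f) the one-sided noise spectral density of the interferometer.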

Relevance:

20.00%

Publisher:

Abstract:

The planning effort for ISP began in 2006, when the IDOC retained the Durrant/PBA team of architects and planners to review the Iowa correctional system. Over the following two years the team conducted two studies, the first being the April 2007 Iowa Department of Corrections Systemic Master Plan. Both studies addressed myriad aspects of the correctional system, including treatment and re-entry needs and programs, security and training, and staffing.

Relevance:

20.00%

Publisher:

Abstract:

Traffic accident investigations are highly complex. They require a large number of specialties drawn from very different domains. While some of these domains are already well exploited, others remain incomplete, and gaps are still observed in practice today that it is essential to remedy. This thesis, entitled "The exploitation of traces in traffic accidents", arises from an interdisciplinary reflection across multiple aspects of forensic science. It is primarily a piece of research intended to demonstrate the benefits of a synergy between microtraces and the study of accident dynamics. To give the work a strongly operational dimension, all the steps undertaken were oriented towards optimising the activity of the first responders at the scene. After an introductory part covering the research project and the theoretical aspects of accident scene reconstruction, the reader is invited to work through five practical chapters, organised according to the doctrine "from the general to the particular". The first stage of this practical part concerns the study of the morphology of traces: sequences of examinations are proposed to improve the interpretation of the contacts between the vehicles and obstacles involved in an accident. The transfer mechanisms of paint traces are then studied, and a series of laboratory tests is carried out on automobile body parts. Various parameters are tested in order to understand their impact on the fragility of a paint system. Next, a set of treated cases (crash tests and real cases) is presented, providing useful information on the handling of a case and confirming the results obtained. This is followed by a collection of traces, drawn from the practical experience acquired, intended to guide the search for and collection of traces at the scene. Finally, the question of an "accident" database, allowing optimal management of the collected traces, is addressed.

Relevance:

20.00%

Publisher:

Abstract:

The slow-phase velocity of nystagmus is one of the most sensitive parameters of vestibular function and is currently the standard for evaluating the caloric test. However, assessing this parameter requires recording the response by nystagmography. The aim of this study was to evaluate whether the frequency and duration of the caloric nystagmus, as measured in a clinical test with Frenzel glasses, could predict the result of the recorded test. A retrospective analysis of 222 caloric test results recorded by means of electronystagmography showed a good association among the 3 parameters for unilateral weakness. The asymmetry observed in the velocity can be predicted by a combination of frequency and duration. On the other hand, no relationship was observed between the parameters for directional preponderance. These results indicate that a clinical caloric test using frequency and duration as parameters can be used to predict the unilateral weakness that would be obtained with nystagmography. We propose an evaluation of the caloric test on the basis of diagrams combining the 3 response parameters.
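The abstract does not restate how the unilateral-weakness asymmetry is computed; the conventional Jongkees index (an assumption about the underlying calculation, not a formula quoted from the study) is

    UW = \frac{(RW + RC) - (LW + LC)}{RW + RC + LW + LC} \times 100\%,

where RW, RC, LW and LC denote the responses (peak slow-phase velocity in the recorded test, or frequency and duration in the clinical variant) to right-warm, right-cool, left-warm and left-cool irrigations.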

Relevance:

20.00%

Publisher:

Abstract:

This paper estimates a model of airline competition for the Spanish air transport market. I test the explanatory power of alternative oligopoly models with capacity constraints. In addition, I analyse the degree of density economies. Results show that Spanish airlines' conduct follows a price-leadership scheme, so that it is less competitive than the Cournot solution. I also find evidence that thin routes can be considered natural monopolies.
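The abstract does not spell out how conduct is measured; one common way of nesting competitive, Cournot and less competitive behaviour (a sketch of the standard conduct-parameter formulation, not necessarily the author's exact specification) is the markup relation

    \frac{p - c_i}{p} = \frac{\theta\, s_i}{\eta},

where s_i is carrier i's market share, \eta the market price elasticity of demand, and \theta the conduct parameter, with \theta = 1 corresponding to Cournot and larger values to less competitive (e.g. price-leadership) outcomes.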

Relevance:

20.00%

Publisher:

Abstract:

The magnetic coupling constant of selected cuprate superconductor parent compounds has been determined by means of embedded cluster model and periodic calculations carried out at the same level of theory. The agreement between the two approaches validates the cluster model. This model is subsequently employed in state-of-the-art configuration interaction calculations aimed at obtaining accurate values of the magnetic coupling constant and hopping integral for a series of superconducting cuprates. Likewise, a systematic study of the performance of different ab initio explicitly correlated wave function methods and of several density functional approaches is presented. The accurate determination of the parameters of the t-J Hamiltonian has several consequences. First, it suggests that the appearance of high-Tc superconductivity in existing monolayered cuprates occurs with J/t in the 0.20-0.35 regime. Second, J/t = 0.20 is predicted to be the threshold for the existence of superconductivity and, third, a simple and accurate relationship between the critical temperatures at optimum doping and these parameters is found. However, this quantitative electronic structure versus Tc relationship is only found when both J and t are obtained at the most accurate level of theory.
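For reference, the t-J Hamiltonian whose parameters t and J are being determined is conventionally written (standard textbook form, not reproduced from the abstract) as

    H_{t\text{-}J} = -t \sum_{\langle i,j \rangle,\sigma} \left( \tilde{c}^{\dagger}_{i\sigma} \tilde{c}_{j\sigma} + \mathrm{h.c.} \right) + J \sum_{\langle i,j \rangle} \left( \mathbf{S}_i \cdot \mathbf{S}_j - \tfrac{1}{4} n_i n_j \right),

where the \tilde{c} operators act in the subspace with double occupancy projected out, t is the nearest-neighbour hopping integral and J the magnetic (superexchange) coupling constant.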

Relevance:

20.00%

Publisher:

Abstract:

The role of the bridging ligand in the effective Heisenberg coupling parameters is analyzed in detail. This analysis strongly suggests that ligand-to-metal charge transfer excitations are responsible for a large part of the final value of the magnetic coupling constant. This permits us to suggest a variant of the difference dedicated configuration interaction (DDCI) method, presently one of the most accurate and reliable methods for the evaluation of magnetic effective interactions. The new variant treats the bridging-ligand orbitals mediating the interaction at the same level as the magnetic orbitals and preserves the high quality of the DDCI results while being much less computationally demanding. The numerical accuracy of the new approach is illustrated on various systems with one or two magnetic electrons per magnetic center. The fact that accurate results can be obtained using a rather reduced configuration interaction space opens the possibility of studying more complex systems with many magnetic centers and/or many electrons per center.
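The effective interactions referred to here are the couplings J_{ij} of the Heisenberg spin Hamiltonian, conventionally written (sign conventions differ between authors; this is one common choice, not quoted from the abstract) as

    \hat{H} = -\sum_{\langle i,j \rangle} J_{ij}\, \hat{\mathbf{S}}_i \cdot \hat{\mathbf{S}}_j,

so that, with this convention, J_{ij} < 0 corresponds to antiferromagnetic coupling between magnetic centers i and j.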

Relevance:

20.00%

Publisher:

Abstract:

Excessive daytime sleepiness underlies a large number of reported motor vehicle crashes. Fair and accurate field measures are needed to assess at-risk drivers who have been flagged, on the basis of erratic driving behavior, as potentially driving in a sleep-deprived state. The purpose of this research study was to evaluate a set of cognitive tests that can assist Motor Vehicle Enforcement Officers on duty in identifying drivers who may be engaged in sleep-impaired driving. Currently no gold-standard test exists to judge sleepiness in the field. Previous research has shown that the Psychomotor Vigilance Task (PVT) is sensitive to sleep deprivation. The first goal of the current study was to evaluate whether computerized tests of attention and memory, briefer than the PVT, would be as sensitive to sleepiness effects. The second goal was to evaluate whether objective and subjective indices of acute and cumulative sleepiness predicted cognitive performance. Findings showed that sleepiness effects were detected in three out of six tasks. Furthermore, the PVT was the only task that showed a consistent sleepiness-related slowing of both the 'best' (minimum) and the 'typical' (median) reaction times. However, the PVT failed to show significant associations with objective measures of sleep deprivation (number of hours awake). The findings indicate that sleepiness tests in the field have significant limitations, and they clearly show that it will not be possible to set absolute performance thresholds to identify sleep-impaired drivers based on cognitive performance on any test. Cooperation with industry to adjust work and rest cycles, and incentives to comply with those regulations, will be critical components of a broad policy to prevent sleepy truck drivers from getting on the road.
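As an illustration of the reaction-time summaries mentioned above, the following sketch computes the 'best' (minimum) and 'typical' (median) reaction time and a lapse count from a PVT session; the function name, thresholds and example data are illustrative assumptions, not the study's actual analysis code.

    # Illustrative sketch, not the study's analysis code: summary metrics
    # commonly derived from PVT reaction times (milliseconds).
    from statistics import median

    def pvt_summary(reaction_times_ms, lapse_threshold_ms=500):
        """Return the 'best' (minimum) RT, 'typical' (median) RT and lapse count."""
        valid = [rt for rt in reaction_times_ms if rt > 100]  # drop false starts (<100 ms)
        return {
            "best_rt_ms": min(valid),
            "median_rt_ms": median(valid),
            "lapses": sum(1 for rt in valid if rt >= lapse_threshold_ms),
        }

    # Example: a sleepy session shows a higher median RT and more lapses.
    print(pvt_summary([310, 295, 620, 480, 330, 705, 350]))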

Relevance:

20.00%

Publisher:

Abstract:

The research reported in this series of articles aimed at (1) automating the search of questioned ink specimens in ink reference collections and (2) evaluating the strength of ink evidence in a transparent and balanced manner. These aims require that ink samples be analysed in an accurate and reproducible way and compared in an objective and automated way, the latter requirement being due to the large number of comparisons that are necessary in both scenarios. A research programme was designed to (a) develop a standard methodology for analysing ink samples in a reproducible way, (b) compare ink samples automatically and objectively, and (c) evaluate the proposed methodology in forensic contexts. This report focuses on the last of the three stages of the research programme. The calibration and acquisition process and the mathematical comparison algorithms were described in previous papers [C. Neumann, P. Margot, New perspectives in the use of ink evidence in forensic science-Part I: Development of a quality assurance process for forensic ink analysis by HPTLC, Forensic Sci. Int. 185 (2009) 29-37; C. Neumann, P. Margot, New perspectives in the use of ink evidence in forensic science-Part II: Development and testing of mathematical algorithms for the automatic comparison of ink samples analysed by HPTLC, Forensic Sci. Int. 185 (2009) 38-50]. In this paper, the benefits and challenges of the proposed concepts are tested in two forensic contexts: (1) ink identification and (2) ink evidential value assessment. The results show that different algorithms are better suited to different tasks. This research shows that it is possible to build digital ink libraries using the most commonly used ink analytical technique, high-performance thin-layer chromatography, despite its reputation for lacking reproducibility. More importantly, it is possible to assign evidential value to ink evidence in a transparent way using a probabilistic model, and therefore to move away from the traditional subjective approach, which is entirely based on experts' opinion and is usually not very informative. While there is room for improvement, this report demonstrates the significant gains obtained over the traditional subjective approach for the search of ink specimens in ink databases and the interpretation of their evidential value.
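The abstract does not detail the probabilistic model; as a schematic illustration of the score-based likelihood-ratio idea it refers to, the sketch below compares the probability of an observed comparison score under same-source and different-source propositions. All names, distributions and numbers are hypothetical; the actual model is described in the cited Forensic Sci. Int. papers.

    # Schematic sketch of a score-based likelihood ratio (all names and data
    # hypothetical; the actual model is described in the cited papers).
    from statistics import NormalDist

    def likelihood_ratio(score, same_source_scores, different_source_scores):
        """Gaussian approximation of the two score distributions; LR > 1 supports
        the proposition that the two ink entries share a common source."""
        same = NormalDist.from_samples(same_source_scores)
        diff = NormalDist.from_samples(different_source_scores)
        return same.pdf(score) / diff.pdf(score)

    # Example with toy calibration scores.
    lr = likelihood_ratio(0.9,
                          same_source_scores=[0.95, 0.90, 0.88, 0.92, 0.97],
                          different_source_scores=[0.40, 0.55, 0.60, 0.35, 0.50])
    print(f"LR = {lr:.1f}")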

Relevance:

20.00%

Publisher:

Abstract:

Application of semi-distributed hydrological models to large, heterogeneous watersheds raises several problems. On the one hand, the spatial and temporal variability of catchment features should be adequately represented in the model parameterization, while keeping model complexity at an acceptable level in order to take advantage of state-of-the-art calibration techniques. On the other hand, model complexity increases the uncertainty in adjusted model parameter values, and therefore the uncertainty in the water routing across the watershed. This is critical for water quality applications, where not only streamflow but also a reliable estimate of the surface versus subsurface contributions to runoff is needed. In this study, we show how a regularized inversion procedure combined with a multiobjective calibration strategy successfully solves the parameterization of a complex application of a water quality-oriented hydrological model. The final values of several optimized parameters showed significant and consistent differences across geological and landscape features. Although the number of optimized parameters was significantly increased by the spatial and temporal discretization of adjustable parameters, the uncertainty in the water routing results remained at reasonable values. In addition, a stepwise numerical analysis showed that the effects on calibration performance of including different data types in the objective function can be inextricably linked; caution should therefore be taken when adding data to or removing data from an aggregated objective function.
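As a concrete illustration of the aggregated objective function discussed above, the sketch below combines goodness-of-fit terms for streamflow and a water-quality variable into a single weighted loss; the weights, metric and variable names are illustrative assumptions, not the study's actual calibration setup.

    # Illustrative sketch of a weighted-sum (aggregated) calibration objective;
    # weights and component metrics are assumptions, not the study's setup.
    import numpy as np

    def nse(sim, obs):
        """Nash-Sutcliffe efficiency between simulated and observed series."""
        sim, obs = np.asarray(sim, float), np.asarray(obs, float)
        return 1.0 - np.sum((sim - obs) ** 2) / np.sum((obs - obs.mean()) ** 2)

    def aggregated_objective(sim_flow, obs_flow, sim_conc, obs_conc,
                             w_flow=0.7, w_conc=0.3):
        """Weighted sum of (1 - NSE) terms, so lower values mean a better fit."""
        return (w_flow * (1.0 - nse(sim_flow, obs_flow))
                + w_conc * (1.0 - nse(sim_conc, obs_conc)))

    # Example call with toy streamflow and concentration series.
    print(aggregated_objective([1.0, 2.1, 3.2], [1.0, 2.0, 3.0],
                               [0.40, 0.50, 0.70], [0.45, 0.50, 0.65]))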

Relevance:

20.00%

Publisher:

Abstract:

This project was undertaken to study the relationships between the performance of locally available asphalts under Iowa conditions and their physicochemical properties, with the ultimate objective of developing a local, performance-based asphalt specification for durable pavements. Physical and physicochemical tests were performed on three sets of asphalt samples: (a) twelve samples from local asphalt suppliers and their TFOT residues, (b) six core samples with known service records, and (c) a total of 79 asphalts from 10 pavement projects, including original, lab-aged and recovered asphalts from field mixes as well as from lab-aged mixes. Tests included standard rheological tests, HP-GPC and TMA. Some specific viscoelastic tests (at 5 °C) were run on the set (b) samples and on some set (a) samples. DSC and X-ray diffraction studies were performed on the set (a) and set (b) samples. Furthermore, NMR techniques were applied to some samples from sets (a), (b) and (c). Efforts were made to identify physicochemical properties that correlate with physical properties known to affect field performance. The significant physicochemical parameters were used as the basis for an improved performance-based trial specification for Iowa to ensure more durable pavements.

Relevance:

20.00%

Publisher:

Abstract:

Research projects aimed at proposing fingerprint statistical models based on the likelihood-ratio framework have shown that low-quality finger impressions left at crime scenes may have significant evidential value. These impressions are currently either not recovered, considered to be of no value when first analyzed by fingerprint examiners, or lead to inconclusive results when compared to control prints. There are growing concerns within the fingerprint community that recovering and examining these low-quality impressions will significantly increase the workload of fingerprint units and, ultimately, the number of backlogged cases. This study was designed to measure the number of impressions currently not recovered or not considered for examination, and to assess the usefulness of these impressions in terms of the number of additional detections that would result from their examination.

Relevance:

20.00%

Publisher:

Abstract:

An expert system has been developed that provides 24-hour forecasts of roadway and bridge frost for locations in Iowa. The system is based on analysis of frost observations taken by highway maintenance personnel, on the conditions leading to frost as described by meteorologists with experience in forecasting bridge and roadway frost, and on fundamental physical principles of frost processes. The expert system requires the forecaster to enter information on recent maximum and minimum temperatures together with forecasts of maximum and minimum air temperatures, dew point temperatures, precipitation, cloudiness, and wind speed. The system has been used operationally for the last two frost seasons by Freese-Notis Associates, who have been under contract with the Iowa DOT to supply frost forecasts. The operational meteorologists give the system their strong endorsement: they always consult it before making a frost forecast unless conditions clearly indicate that frost is not likely. In operational use, the system is run several times with different input values to test the sensitivity of frost formation on a particular day to various meteorological parameters. The users comment that the system helps them to consider all the factors relevant to frost formation, and it is regarded as an office companion for making frost forecasts.
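To illustrate the kind of reasoning such a rule-based system encodes, the sketch below flags frost risk when the surface is expected to cool to or below both freezing and the dew point under calm, clear conditions; the thresholds, inputs and function name are illustrative assumptions and not the expert system's actual rule base.

    # Toy rule-of-thumb sketch; thresholds and inputs are illustrative only,
    # not the expert system's actual rules.
    def frost_risk(surface_temp_c, dew_point_c, wind_speed_ms, cloud_cover_fraction):
        """Crude frost indicator for a road or bridge surface."""
        condensation_possible = surface_temp_c <= dew_point_c
        freezing = surface_temp_c <= 0.0
        radiative_cooling = wind_speed_ms < 4.0 and cloud_cover_fraction < 0.3
        if freezing and condensation_possible and radiative_cooling:
            return "frost likely"
        if freezing and condensation_possible:
            return "frost possible"
        return "frost unlikely"

    # Example: a calm, clear night with the bridge deck below the dew point.
    print(frost_risk(surface_temp_c=-2.0, dew_point_c=-1.0,
                     wind_speed_ms=1.5, cloud_cover_fraction=0.1))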