906 results for Autoregressive-Moving Average model


Relevance:

30.00%

Publisher:

Abstract:

The authors screened 34 large cattle herds for the presence of Mycoplasma bovis infection by examining slaughtered cattle for macroscopic lung lesions, by culturing M. bovis from lung lesions and, at the same time, by testing sera for the presence of antibodies against M. bovis. Among the 595 cattle examined, 33.9% had pneumonic lesions, mycoplasmas were isolated from 59.9% of pneumonic lung samples, and 10.9% of sera from those animals contained antibodies to M. bovis. In 25.2% of the cases, M. bovis was isolated from lungs with no macroscopic lesions. The proportion of seropositive herds was 64.7%. The average seropositivity rate of individuals was 11.3%, but in certain herds it exceeded 50%. A probability model was developed for examining the relationship among the occurrence of pneumonia, the isolation of M. bovis from the lungs, and the presence of M. bovis-specific antibodies in sera.

Relevance:

30.00%

Publisher:

Abstract:

The purpose of this study was to document and critically analyze the lived experience of selected nursing staff developers in the process of moving toward a new model for hospital nursing education. Eleven respondents were drawn from a nationwide population of about two hundred individuals involved in nursing staff development. These subjects were responsible for the implementation of the Performance Based Development System (PBDS) in their institutions. A purposive, criterion-based sampling technique was used, with respondents selected according to size of hospital, primary responsibility for orchestration of the change, influence over budgetary factors, and managerial responsibility for PBDS. Data were gathered by the researcher through both in-person and telephone interviews. A semi-structured interview guide, designed by the researcher, was used, and respondents were encouraged to amplify on their recollections as desired. Audiotapes were transcribed, and the resulting computer files were analyzed using the program "Martin". Answers to interview questions were compiled and reported across cases. The data were then reviewed a second time and interpreted for emerging themes and patterns. Two types of verification were used in the study. Internal verification was done through interview transcript review and feedback by respondents. External verification was done through review of, and feedback on, the data analysis by readers who were experienced in managing staff development departments. All respondents were female, so Gilligan's concept of the "ethic of care" was examined as a decision-making strategy. Three levels of caring that influenced decision making were found: caring (a) for the organization, (b) for the employee, and (c) for the patient. The four existentials of the lived experience (relationality, corporeality, temporality, and spatiality) were also examined to reveal the everydayness of making change.

Relevance:

30.00%

Publisher:

Abstract:

The freshman year is the most critical year of matriculation for students in higher education. One in four freshman students drops out of higher education after the first year. In fact, the first two to six weeks of college represent a very critical transition period, when students make the decision to persist in or depart from the institution. Many students leave because they are unable to make a connection with the institution. Retention is often profoundly affected by student involvement in the academic environment, satisfaction with the campus climate, and the institution's response to diversity. Therefore, the purpose of this study was to examine and evaluate an effective institutional response that promotes freshman retention and academic success. The tenets of a mentoring model (diversity training, conflict management, and community building) were applied to a freshman experience seminar class (experimental group) as a pedagogical method of instruction to determine its efficacy as a retention initiative when compared with the traditional freshman experience seminar class (comparison group). The quantitative study employed a quasi-experimental research design based on Astin's (1993) I-E-O model. The model examined the relationships between the characteristics students bring with them to college (inputs), their experiences in the environment during college, and the outcomes students achieve during matriculation. Fifty-two students enrolled in the freshman seminar class participated in the study. Demographic data and input variables between groups were analyzed using chi-square tests, t-tests, and multivariate analyses. Overall, students in the experimental group had significantly higher satisfaction (campus climate) scores than the comparison group. An analysis of the students' willingness to interact with others from diverse groups indicated a significant difference between groups, with the experimental group scoring higher than the comparison group. Students in the experimental group were also significantly more involved in campus activities than students in the comparison group. No significant differences were found between groups in mean grade point average or re-enrollment for the fall 2001 semester. While the mentoring model did not directly affect re-enrollment, it did promote student satisfaction with the institution and an appreciation for diversity of contact, and it encouraged involvement in the campus community. These are all essential outcomes of a quality retention program.

Relevance:

30.00%

Publisher:

Abstract:

Annual average daily traffic (AADT) is important information for many transportation planning, design, operation, and maintenance activities, as well as for the allocation of highway funds. Many studies have attempted AADT estimation using the factor approach, regression analysis, time series models, and artificial neural networks. However, these methods are unable to account for the spatially variable influence of independent variables on the dependent variable, even though it is well known that spatial context is important for many transportation problems, including AADT estimation. In this study, applications of geographically weighted regression (GWR) methods to estimating AADT were investigated. The GWR-based methods considered the influence of correlations among the variables over space and the spatial non-stationarity of the variables. A GWR model allows different relationships between the dependent and independent variables to exist at different points in space. In other words, model parameters vary from location to location, and the locally linear regression parameters at a point are affected more by observations near that point than by observations further away. The study area was Broward County, Florida, which lies on the Atlantic coast between Palm Beach and Miami-Dade counties. A total of 67 variables were considered as potential AADT predictors, and six variables (lanes, speed, regional accessibility, direct access, density of roadway length, and density of seasonal households) were selected to develop the models. To investigate the predictive power of the various AADT predictors over space, statistics including the local r-square, local parameter estimates, and local errors were examined and mapped. The local variations in relationships among parameters were investigated, measured, and mapped to assess the usefulness of GWR methods. The results indicated that the GWR models were able to better explain the variation in the data and to predict AADT with smaller errors than ordinary linear regression models for the same dataset. Additionally, GWR was able to model the spatial non-stationarity in the data, i.e., the spatially varying relationship between AADT and its predictors, which cannot be modeled in ordinary linear regression.
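The core of a GWR fit can be illustrated in a few lines: at each prediction location, ordinary least squares is replaced by weighted least squares, with weights that decay with distance from that location. The sketch below is a minimal illustration only, assuming a Gaussian kernel, a fixed bandwidth, and synthetic data; the variable names and values are hypothetical and do not reproduce the Broward County model described above.

```python
# Minimal GWR sketch: locally weighted least squares with a Gaussian kernel.
import numpy as np

def gwr_at_point(x0, y0, coords, X, y, bandwidth):
    """Fit a locally weighted linear model at location (x0, y0).

    coords : (n, 2) array of observation locations
    X      : (n, k) design matrix (first column of ones for the intercept)
    y      : (n,) response (e.g., log AADT)
    """
    d = np.hypot(coords[:, 0] - x0, coords[:, 1] - y0)
    w = np.exp(-0.5 * (d / bandwidth) ** 2)        # Gaussian distance-decay weights
    W = np.diag(w)
    # Weighted least squares: beta(x0, y0) = (X'WX)^-1 X'Wy
    beta = np.linalg.solve(X.T @ W @ X, X.T @ W @ y)
    return beta                                     # local coefficient estimates

# Synthetic example: 100 count stations, intercept plus 2 hypothetical predictors
rng = np.random.default_rng(0)
coords = rng.uniform(0, 10, size=(100, 2))
X = np.column_stack([np.ones(100), rng.normal(size=(100, 2))])
y = X @ np.array([2.0, 1.5, -0.5]) + rng.normal(scale=0.1, size=100)
print(gwr_at_point(5.0, 5.0, coords, X, y, bandwidth=2.0))
```

Repeating the same fit at every location of interest yields the maps of local parameter estimates and local r-square referred to in the abstract.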

Relevance:

30.00%

Publisher:

Abstract:

The main objective of this work is to develop a quasi-three-dimensional numerical model to simulate stony debris flows, considering a continuum fluid phase, composed of water and fine sediments, and a non-continuum phase including large particles such as pebbles and boulders. Large particles are treated in a Lagrangian frame of reference using the Discrete Element Method, while the fluid phase is based on an Eulerian approach, using the Finite Element Method to solve the depth-averaged Navier-Stokes equations in two horizontal dimensions. The particles' equations of motion are in three dimensions. The model simulates particle-particle and wall-particle collisions, taking into account that the particles are immersed in a fluid. Bingham and Cross rheological models are used for the continuum phase. Both formulations provide very stable results, even in the range of very low shear rates, and the Bingham formulation is better able to simulate the stopping stage of the fluid when the applied shear stresses are low. Results of numerical simulations have been compared with data from laboratory experiments on a flume-fan prototype. The results show that the model is capable of simulating the motion of big particles moving in the fluid flow, handling dense particulate flows and avoiding overlap among particles. An application to the debris flow events that occurred in northern Venezuela in 1999 shows that the model could replicate the main boulder accumulation areas surveyed by the USGS. The uniqueness of this research lies in the integration of mud flow and stony debris movement in a single modeling tool that can be used for the planning and management of debris flow prone areas.
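The two rheological closures named above can be expressed as apparent-viscosity functions of the shear rate. The sketch below uses textbook forms of the Bingham model (with an exponential, Papanastasiou-type regularization so the viscosity stays bounded at very low shear rates) and the Cross model; the parameter values are illustrative assumptions, not the values calibrated in this work.

```python
# Apparent-viscosity sketches for the Bingham and Cross rheological models.
import numpy as np

def bingham_apparent_viscosity(gamma_dot, tau_y, mu_p, m=1000.0):
    """Regularized Bingham fluid: mu = mu_p + tau_y (1 - exp(-m*gdot)) / gdot.
    The exponential term keeps the viscosity finite as the shear rate -> 0."""
    return mu_p + tau_y * (1.0 - np.exp(-m * gamma_dot)) / np.maximum(gamma_dot, 1e-12)

def cross_apparent_viscosity(gamma_dot, mu_0, mu_inf, K, n):
    """Cross model: smooth transition from a high-viscosity plateau mu_0 at
    low shear rates to mu_inf at high shear rates."""
    return mu_inf + (mu_0 - mu_inf) / (1.0 + (K * gamma_dot) ** n)

gamma_dot = np.logspace(-3, 2, 6)          # very low to moderate shear rates (1/s)
print(bingham_apparent_viscosity(gamma_dot, tau_y=50.0, mu_p=0.5))
print(cross_apparent_viscosity(gamma_dot, mu_0=1e3, mu_inf=0.5, K=10.0, n=1.0))
```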

Relevance:

30.00%

Publisher:

Abstract:

Digital systems can generate left and right audio channels that create the effect of virtual sound source placement (spatialization) by processing an audio signal through pairs of Head-Related Transfer Functions (HRTFs) or, equivalently, Head-Related Impulse Responses (HRIRs). The spatialization effect is better when individually measured HRTFs or HRIRs are used than when generic ones (e.g., from a mannequin) are used. However, the measurement process is not available to the majority of users. There is ongoing interest in finding mechanisms to customize HRTFs or HRIRs to a specific user, in order to achieve an improved spatialization effect for that subject. Unfortunately, the current models used for HRTFs and HRIRs contain over a hundred parameters, and none of those parameters can be easily related to the characteristics of the subject. This dissertation proposes an alternative model for the representation of HRTFs, which contains at most 30 parameters, all of which have a defined functional significance. It also presents methods to obtain the values of the model parameters that make it approximately equivalent to an individually measured HRTF. This conversion is achieved by the systematic deconstruction of HRIR sequences through an augmented version of the Hankel Total Least Squares (HTLS) decomposition approach. An average 95% match (fit) was observed between the original HRIRs and those reconstructed from the Damped and Delayed Sinusoids (DDSs) found by the decomposition process, for ipsilateral source locations. The dissertation also introduces and evaluates an HRIR customization procedure, based on a multilinear model implemented through a 3-mode tensor, for mapping anatomical data from the subjects to the HRIR sequences at different sound source locations. This model uses the Higher-Order Singular Value Decomposition (HOSVD) method to represent the HRIRs and is capable of generating customized HRIRs from easily attainable anatomical measurements of a new intended user of the system. Listening tests were performed to compare the spatialization performance of customized, generic, and individually measured HRIRs when they are used for synthesized spatial audio. Statistical analysis of the results confirms that the type of HRIRs used for spatialization is a significant factor in spatialization success, with the customized HRIRs yielding better results than generic HRIRs.
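The damped-and-delayed-sinusoid (DDS) representation mentioned above can be sketched as follows: an HRIR is approximated by a short sum of exponentially damped sinusoids, each switched on after its own delay. The component values below are made up purely for illustration, and the augmented HTLS decomposition that estimates such components from measured HRIRs is not reproduced here.

```python
# Minimal sketch of an HRIR synthesized from damped and delayed sinusoids (DDS).
import numpy as np

def dds_hrir(length, fs, components):
    """components: list of (amplitude, damping, freq_hz, phase, delay_samples)."""
    n = np.arange(length)
    h = np.zeros(length)
    for a, d, f, phi, n0 in components:
        t = np.clip(n - n0, 0, None)           # samples since this component's onset
        u = (n >= n0).astype(float)            # unit step: silent before the delay
        h += u * a * np.exp(-d * t) * np.cos(2 * np.pi * f * t / fs + phi)
    return h

fs = 44100
components = [(1.0, 0.05, 3000.0, 0.0, 10),    # hypothetical DDS terms
              (0.4, 0.02, 8000.0, 0.7, 14)]
h = dds_hrir(256, fs, components)
print(h[:16])
```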

Relevance:

30.00%

Publisher:

Abstract:

Annual Average Daily Traffic (AADT) is a critical input to many transportation analyses. By definition, AADT is the average 24-hour volume at a highway location over a full year. Traditionally, AADT is estimated using a mix of permanent and temporary traffic counts. Because field collection of traffic counts is expensive, it is usually done only for the major roads, leaving most local roads without any AADT information. However, AADTs are needed for local roads for many applications. For example, AADTs are used by state Departments of Transportation (DOTs) to calculate the crash rates of all local roads in order to identify the top five percent of hazardous locations for annual reporting to the U.S. DOT. This dissertation develops a new method for estimating AADTs for local roads using travel demand modeling. A major component of the new method is a parcel-level trip generation model that estimates the trips generated by each parcel. The model uses tax parcel data together with the trip generation rates and equations provided by the ITE Trip Generation Report. The generated trips are then distributed to existing traffic count sites using a parcel-level gravity model for trip distribution. The all-or-nothing assignment method is then used to assign the trips onto the roadway network to estimate the final AADTs. The entire process was implemented in the Cube demand modeling system, with extensive spatial data processing in ArcGIS. To evaluate the performance of the new method, data from several study areas in Broward County, Florida, were used. The estimated AADTs were compared with those from two existing methods, using actual traffic counts as the ground truth. The results show that the new method performs better than both existing methods. One limitation of the new method is that it relies on Cube, which limits the number of zones to 32,000. Accordingly, a study area exceeding this limit must be partitioned into smaller areas. Because AADT estimates for roads near the boundary areas were found to be less accurate, further research could examine the best way to partition a study area to minimize this impact.
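The trip distribution step described above is a singly constrained gravity model; a minimal sketch is given below, assuming an exponential impedance function and made-up productions, attractions, and travel costs rather than the study's parcel data and calibrated parameters.

```python
# Singly constrained gravity model for trip distribution (illustrative sketch).
import numpy as np

def gravity_distribution(productions, attractions, cost, beta=0.1):
    """T[i, j] = P_i * A_j * f(c_ij) / sum_k A_k * f(c_ik), with f(c) = exp(-beta*c)."""
    f = np.exp(-beta * cost)                      # distance-decay impedance
    weights = attractions[None, :] * f            # A_j * f(c_ij) for each parcel i
    weights /= weights.sum(axis=1, keepdims=True) # normalize so rows sum to 1
    return productions[:, None] * weights         # split each parcel's trips

productions = np.array([120.0, 80.0, 50.0])       # trips generated per parcel
attractions = np.array([1.0, 2.0, 1.5, 0.5])      # relative pull of count sites
cost = np.array([[2.0, 5.0, 8.0, 4.0],
                 [6.0, 1.0, 3.0, 7.0],
                 [4.0, 6.0, 2.0, 3.0]])           # travel cost parcel -> site
T = gravity_distribution(productions, attractions, cost)
print(T.sum(axis=1))                              # row sums equal the productions
```

The resulting trip table would then be loaded onto the network with all-or-nothing assignment, i.e., every trip follows the single shortest path to its destination.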

Relevance:

30.00%

Publisher:

Abstract:

Bedforms such as dunes and ripples are ubiquitous in rivers and coastal seas, and are commonly described as triangular shapes from which height and length are calculated to estimate hydrodynamic and sediment dynamic parameters. Natural bedforms, however, present a far more complicated morphology; the difference between natural bedform shape and the often assumed triangular shape is usually neglected, and how this may affect the flow is unknown. This study investigates the shapes of natural bedforms and how they influence flow and shear stress, based on four datasets extracted from earlier studies on two rivers (the Rio Paraná in Argentina and the Lower Rhine in The Netherlands). The most commonly occurring morphological elements are a sinusoidal stoss side made of one segment and a lee side made of two segments: a gently sloping upper lee side and a relatively steep (6° to 21°) slip face. A non-hydrostatic numerical model, set up using Delft3D, was used to simulate the flow over fixed bedforms with various morphologies derived from the identified morphological elements. Both shear stress and turbulence increase with increasing slip face angle and are only marginally affected by the dimensions and positions of the upper and lower lee side. The average slip face angle determined from the bed profiles is 14°, over which there is no permanent flow separation. Shear stress and turbulence above natural bedforms are higher than above a flat bed but much lower than over the often assumed 30° lee side angle.
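The morphological elements identified above can be assembled into an idealized profile: a sinusoidal stoss side, a gently sloping upper lee side, and a planar slip face. The sketch below generates such a profile under assumed dimensions (including the assumption that the slip face starts at half the bedform height), purely for illustration; it does not reproduce the measured Rio Paraná or Lower Rhine profiles or the Delft3D flow simulation.

```python
# Idealized bedform profile: sinusoidal stoss side + two-segment lee side.
import numpy as np

def bedform_profile(height, stoss_len, upper_lee_len, slip_face_angle_deg, dx=0.05):
    """Return (x, z) for one bedform: sinusoidal stoss rising to the crest,
    a gentle upper lee dropping to half height, then a planar slip face."""
    xs = np.arange(0.0, stoss_len, dx)
    zs = 0.5 * height * (1.0 - np.cos(np.pi * xs / stoss_len))      # stoss side
    xu = np.arange(0.0, upper_lee_len, dx)
    zu = height - (0.5 * height) * xu / upper_lee_len               # upper lee
    slip_len = (0.5 * height) / np.tan(np.radians(slip_face_angle_deg))
    xf = np.arange(0.0, slip_len + dx, dx)
    zf = 0.5 * height - xf * np.tan(np.radians(slip_face_angle_deg))  # slip face
    x = np.concatenate([xs, stoss_len + xu, stoss_len + upper_lee_len + xf])
    z = np.concatenate([zs, zu, np.maximum(zf, 0.0)])
    return x, z

# 14 degrees is the mean slip face angle reported in the abstract.
x, z = bedform_profile(height=1.0, stoss_len=20.0, upper_lee_len=3.0,
                       slip_face_angle_deg=14.0)
print(x[-1], z.max())
```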

Relevance:

30.00%

Publisher:

Abstract:

Health professionals in several areas (pediatricians, nutritionists, orthopedists, endocrinologists, dentists, etc.) use bone age assessment to diagnose growth disorders in children. Through interviews with specialists in diagnostic imaging and a review of the literature, we identified the Tanner-Whitehouse (TW) method as the most efficient. Although it achieves better results than other methods, it is still not the most widely used, because of the complexity of its use. This work presents the possibility of automating this method so that its use can become more widespread. It also addresses two important steps in the evaluation of bone age: the identification and the classification of regions of interest. Even in radiographs in which the positioning of the hands was not suitable for the TW method, the finger identification algorithm showed good results. Likewise, the use of Active Appearance Models (AAM) showed good results in identifying regions of interest, even in radiographs with large variations in contrast and brightness. Good results were also obtained, through the appearance model, in classifying the epiphyses into their developmental stages, with the middle epiphysis of finger III (the middle finger) chosen to illustrate the performance. The final results show an average accuracy of 90%, and misclassified cases were found to deviate by only one stage from the correct stage.

Relevance:

30.00%

Publisher:

Abstract:

We analyzed projections of current and future ambient temperatures along the eastern United States in relation to the thermal tolerance of harbor seals in air. Using the earth system model HadGEM2-ES and representative concentration pathways (RCPs) 4.5 and 8.5, which correspond to two different atmospheric CO2 concentration trajectories, we examined possible shifts in distribution based on three metrics: current preferences, the thermal limit of juveniles, and the thermal limit of adults. Our analysis focused on average ambient temperatures because harbor seals are least effective at regulating their body temperature in air, making them most susceptible to rising air temperatures in the coming years. Our study focused on the months of May, June, and August for 2041-2060 (2050) and 2061-2080 (2070), as these are historically the months in which harbor seals annually come ashore to pup, breed, and molt; they are also some of the warmest months of the year. We found that breeding colonies along the eastern United States will be limited by the thermal tolerance of juvenile harbor seals in air, while the foraging range will extend as far south as the thermal tolerance of adult harbor seals in air allows. Our analysis revealed that in 2070, harbor seal pups should be absent from the United States coastline near the end of the summer due to exceptionally high air temperatures.

Relevance:

30.00%

Publisher:

Abstract:

Sea ice models contain many different parameterizations, of which one of the most commonly used is a subgrid-scale ice thickness distribution (ITD). The effect of this model component and the associated ice strength formulation on the reproduction of observed Arctic sea ice is assessed. To this end, the model's performance in reproducing satellite observations of sea ice concentration, thickness, and drift is evaluated. For an unbiased comparison, different model configurations with and without an ITD are tuned with an automated parameter optimization. The original combination of ITD and ice strength parameterization does not lead to better results than a simple single-category model. Yet changing to a simpler ice strength formulation, which depends linearly on the mean ice thickness across all thickness categories, clearly reduces the model-data misfit when an ITD is used. In the original formulation, the ice strength depends strongly on the number of thickness categories, so that introducing more categories can lead to ice that is thicker but, on average, weaker.
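A sketch of the simpler, linear-in-mean-thickness strength closure referred to above is given below, using the widely used Hibler (1979) form; the constants P* and C and the category values are illustrative assumptions, not the optimized values from this study.

```python
# Hibler (1979)-type ice strength: linear in mean thickness, exponential in
# the open-water fraction.
import numpy as np

def ice_strength_hibler(mean_thickness, concentration, p_star=27.5e3, c=20.0):
    """P = P* * h_mean * exp(-C * (1 - A)), returned in N/m."""
    return p_star * mean_thickness * np.exp(-c * (1.0 - concentration))

# The mean thickness is the grid-cell average across all ITD categories.
category_thickness = np.array([0.3, 0.9, 1.8, 3.2, 5.0])   # m, per category
category_area = np.array([0.05, 0.15, 0.30, 0.25, 0.10])   # area fractions
h_mean = np.sum(category_thickness * category_area)
A = category_area.sum()                                     # total ice concentration
print(ice_strength_hibler(h_mean, A))
```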

Relevance:

30.00%

Publisher:

Abstract:

Shape-based registration methods are frequently encountered in computer vision, image processing, and medical imaging. The registration problem is to find an optimal transformation or mapping between sets of rigid or non-rigid objects and to automatically solve for correspondences. In this paper we present a comparison of two different probabilistic methods, an entropy-based approach and the growing neural gas (GNG) network, as general feature-based registration algorithms. With the entropy approach, shape modelling is performed by connecting the points with the highest probability of curvature information, while with GNG the point sets are connected using nearest-neighbour relationships derived from competitive Hebbian learning. To compare performance, we use different levels of shape deformation, starting with a simple shape (2D MRI brain ventricles) and moving to more complicated shapes such as hands. Both quantitative and qualitative results are given for both sets.
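The connectivity rule on the GNG side can be sketched with the competitive Hebbian learning step alone: for every sample point, the two nearest network nodes are joined by an edge, so the node graph inherits the topology of the shape. The snippet below shows only this step on random, hypothetical data; node insertion, error accumulation, and edge ageing from the full GNG algorithm are omitted.

```python
# Competitive Hebbian learning: connect the two nodes closest to each sample.
import numpy as np

def chl_edges(nodes, samples):
    """Return the set of edges produced by competitive Hebbian learning."""
    edges = set()
    for s in samples:
        d = np.linalg.norm(nodes - s, axis=1)
        first, second = np.argsort(d)[:2]          # indices of the two closest nodes
        edges.add(tuple(sorted((int(first), int(second)))))
    return edges

rng = np.random.default_rng(1)
nodes = rng.uniform(0, 1, size=(10, 2))            # hypothetical network nodes
samples = rng.uniform(0, 1, size=(200, 2))         # points sampled from the shape
print(sorted(chl_edges(nodes, samples)))
```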

Relevance:

30.00%

Publisher:

Abstract:

The neoliberal period was accompanied by a momentous transformation within the US health care system. As a result of a number of political and historical dynamics, the healthcare law signed by President Barack Obama in 2010, the Affordable Care Act (ACA), drew less on universal models from abroad than it did on earlier conservative healthcare reform proposals. This was in part the result of the influence of powerful corporate healthcare interests. While the ACA expands healthcare coverage, it does so incompletely and unevenly, with persistent uninsurance and disparities in access based on insurance status. Additionally, the law accommodates an overall shift towards a consumerist model of care characterized by high cost sharing at the time of use. Finally, the law encourages the further consolidation of the healthcare sector, for instance into units named "Accountable Care Organizations" that closely resemble the health maintenance organizations favored by managed care advocates. The overall effect has been to maintain a fragmented system that is neither equitable nor efficient. A single-payer universal system would, in contrast, help transform healthcare into a social right.

Relevance:

30.00%

Publisher:

Abstract:

A modified UNIFAC–VISCO group contribution method was developed for the correlation and prediction of the viscosity of ionic liquids as a function of temperature at 0.1 MPa. In this original approach, cations and anions were treated as distinct molecular groups. The significance of this approach comes from its ability to calculate the viscosity of mixtures of ionic liquids as well as of pure ionic liquids. Binary interaction parameters for selected cations and anions were determined by fitting the experimental viscosity data available in the literature for selected ionic liquids. The temperature dependence of the cation and anion contributions to the viscosity was fitted to Vogel–Fulcher–Tammann (VFT) behavior. The binary interaction parameters and VFT-type fitting parameters were then used to determine the viscosity of pure ionic liquids and of mixtures with different combinations of cations and anions, in order to validate the prediction method. The viscosities of binary ionic liquid mixtures were then calculated using this prediction method. In this work, the viscosity data of pure ionic liquids and of binary mixtures of ionic liquids are successfully calculated from 293.15 K to 363.15 K at 0.1 MPa. All calculated viscosity data showed excellent agreement with experimental data, with a relative absolute average deviation lower than 1.7%.
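The VFT temperature dependence mentioned above has the standard form eta(T) = eta_0 * exp(B / (T - T_0)). The sketch below evaluates it over the temperature range of the study with illustrative parameter values; these are not the fitted group-contribution parameters of this work.

```python
# Vogel-Fulcher-Tammann (VFT) viscosity-temperature relation (illustrative values).
import numpy as np

def vft_viscosity(T, eta_0, B, T_0):
    """eta(T) = eta_0 * exp(B / (T - T_0)); T in K, eta in the units of eta_0."""
    return eta_0 * np.exp(B / (T - T_0))

T = np.arange(293.15, 363.15 + 1e-9, 10.0)   # temperature range covered by the study
print(vft_viscosity(T, eta_0=0.2, B=800.0, T_0=165.0))
```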