990 results for Machinability Assessment
Abstract:
A methodology for reliability-based optimum design of reinforced soil structures subjected to horizontal and vertical sinusoidal excitation, based on the pseudo-dynamic approach, is presented. The tensile strength of reinforcement required to maintain stability is computed using a logarithmic spiral failure mechanism. The backfill soil properties and the geometric and strength properties of the reinforcement are treated as random variables. The effects of parameters such as the soil friction angle, horizontal and vertical seismic accelerations, shear and primary wave velocities, and amplification factors for seismic acceleration on the component and system probabilities of failure, in relation to the tension and pullout capacities of the reinforcement, are discussed. To evaluate the validity of the present formulation, static and seismic reinforcement force coefficients computed by the present method are compared with those given by other authors. The importance of the shear wave velocity in estimating the reliability of the structure is highlighted. Ditlevsen's bounds on the system probability of failure are also computed by taking into account the correlations between the three failure modes, which are evaluated using the direction cosines of the tangent planes at the most probable points of failure. (c) 2009 Elsevier Ltd. All rights reserved.
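For readers less familiar with the system-reliability step, the following is a minimal Python sketch, not the paper's code, of Ditlevsen's bounds for correlated failure modes; the pairwise mode correlations are taken as the dot products of the direction-cosine vectors at the most probable points, and all numerical values are hypothetical.

```python
# Sketch only: Ditlevsen's bounds on the system probability of failure for
# correlated failure modes, with mode correlations taken from the direction
# cosines (alpha vectors) at the most probable points. All inputs are made up.
import numpy as np
from scipy.stats import norm, multivariate_normal

def ditlevsen_bounds(betas, alphas):
    """betas: (n,) reliability indices; alphas: (n, k) unit normal vectors at the MPPs."""
    betas = np.asarray(betas, dtype=float)
    alphas = np.asarray(alphas, dtype=float)
    n = len(betas)
    p = norm.cdf(-betas)                      # component failure probabilities
    rho = alphas @ alphas.T                   # correlations between failure modes
    pij = np.zeros((n, n))                    # pairwise joint failure probabilities
    for i in range(n):
        for j in range(i):
            cov = [[1.0, rho[i, j]], [rho[i, j], 1.0]]
            pij[i, j] = multivariate_normal(mean=[0.0, 0.0], cov=cov).cdf(
                [-betas[i], -betas[j]])       # bivariate normal approximation
    lower = p[0] + sum(max(0.0, p[i] - pij[i, :i].sum()) for i in range(1, n))
    upper = p.sum() - sum(pij[i, :i].max() for i in range(1, n))
    return lower, upper

# hypothetical reliability indices and direction cosines for three failure modes
betas = [2.8, 3.1, 3.4]
alphas = np.array([[0.80, 0.60], [0.60, 0.80], [0.70, 0.71]])
alphas /= np.linalg.norm(alphas, axis=1, keepdims=True)
print(ditlevsen_bounds(betas, alphas))
```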
Abstract:
While the majority of violent threats – defined as an expression of intent to do harm or act out violently against someone or something – do not progress to actual violence, a small proportion of threateners do go on to enact violence. Most researchers argue that violence risk assessments are inadequate for assessing threats of violence, which raises the question: how should a threat assessment (TA) be conducted? To begin to understand available frameworks for assessing threats, a systematic review of TA research literature was conducted. Most TA literature pertains to a specific domain (schools, public figure threats, workplaces) and target audience (clinicians, school personnel, law enforcement). TA guidelines are typically based on literature reviews with some based on empirical measures and others having no strong evidential basis. The most common concepts in TA are exploration of the threatener's mental health, the motivation for the threat and the presence of any plans. Rather than advocating for the development of a protocol for conducting TA, this article outlines the common areas of inquiry in assessing threats and highlights the limitations of current TA guidelines.
Abstract:
The aim of this study was to estimate the development of fertility in North-Central Namibia, former Ovamboland, from 1960 to 2001. Special attention was given to the onset of fertility decline and to the impact of the HIV epidemic on fertility. An additional aim was to introduce parish registers as a source of data for fertility research in Africa. Data used consisted of parish registers from Evangelical Lutheran congregations, the 1991 and 2001 Population and Housing Censuses, the 1992 and 2000 Namibia Demographic and Health Surveys, and the HIV sentinel surveillances of 1992-2004. Both period and cohort fertility were analysed. The P/F ratio method was used when analysing census data. The impact of HIV infection on fertility was estimated indirectly by comparing the fertility histories of women who died at an age of less than 50 years with the fertility of other women. The impact of the HIV epidemic on fertility was assessed both among infected women and in the general population. Fertility in the study population began to decline in 1980. The decline was rapid during the 1980s, levelled off in the early 1990s at the end of the war of independence and then continued to decline until the end of the study period. According to parish registers, total fertility was 6.4 in the 1960s and 6.5 in the 1970s, and declined to 5.1 in the 1980s and 4.2 in the 1990s. Adjustment of these total fertility rates to correspond to levels of fertility based on data from the 1991 and 2001 censuses resulted in total fertility declining from 7.6 in 1960-79 to 6.0 in 1980-89, and to 4.9 in 1990-99. The decline was associated with increased age at first marriage, declining marital fertility and increasing premarital fertility. Fertility among adolescents increased, whereas the fertility of women in all other age groups declined. During the 1980s, the war of independence contributed to declining fertility through spousal separation and delayed marriages. Contraception has been employed in the study region since the 1980s, but in the early 1990s, use of contraceptives was still so limited that fertility was higher in North-Central Namibia than in other regions of the country. In the 1990s, fertility decline was largely a result of the increased prevalence of contraception. HIV prevalence among pregnant women increased from 4% in 1992 to 25% in 2001. In 2001, total fertility among HIV-infected women (3.7) was lower than that among other women (4.8), resulting in total fertility of 4.4 among the general population in 2001. The HIV epidemic explained more than a quarter of the decline in total fertility at population level during most of the 1990s. The HIV epidemic also reduced the number of children born by reducing the number of potential mothers. In the future, HIV will have an extensive influence on both the size and age structure of the Namibian population. Although HIV influences demographic development through both fertility and mortality, the effect through changes in fertility will be smaller than the effect through mortality. In the study region, as in some other regions of southern Africa, a new type of demographic transition is under way, one in which population growth stagnates or even reverses because of the combined effects of declining fertility and increasing mortality, both of which are consequences of the HIV pandemic.
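For readers wanting to see the population-level arithmetic implied by the last figures, here is a purely illustrative sketch; it is not the study's estimation procedure, and it uses antenatal HIV prevalence as a rough stand-in for prevalence among all women of reproductive age, which is why it does not reproduce the reported 4.4 exactly.

```python
# Illustrative only: population total fertility as a prevalence-weighted mix of
# fertility among HIV-infected and uninfected women, using the rounded figures
# quoted above. Antenatal prevalence is a crude proxy here, not the study's input.
prevalence = 0.25      # HIV prevalence among pregnant women, 2001
tfr_hiv_pos = 3.7      # total fertility, HIV-infected women
tfr_hiv_neg = 4.8      # total fertility, other women

tfr_population = prevalence * tfr_hiv_pos + (1 - prevalence) * tfr_hiv_neg
print(round(tfr_population, 2))   # ~4.52; the study reports 4.4 after full adjustment
```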
Abstract:
The cultural appropriateness of human service processes is a major factor in determining the effectiveness of their delivery. Sensitivity to issues of culture is particularly critical in dealing with family disputes, which are generally highly emotive and require difficult decisions to be made regarding children, material assets and ongoing relationships. In this article we draw on findings from an evaluation of the Family Relationship Centre at Broadmeadows (FRCB) to offer some insights into and suggestions about managing cultural matters in the current practice of family dispute resolution (FDR) in Australia. The brief for the original research was to evaluate the cultural appropriateness of FDR services offered to culturally and linguistically diverse (CALD) communities living within the FRCB’s catchment area, specifically members of the Lebanese, Turkish and Iraqi communities. The conclusions of the evaluations were substantially positive. The work of the Centre was found to illustrate many aspects of best practice but also raised questions worthy of future exploration. The current article reports on issues of access, retention and outcomes obtained by CALD clients at various stages of the FRCB service.
Abstract:
The cultural appropriateness of human service processes is a major factor in determining the effectiveness of their delivery. Sensitivity to issues of culture is particularly critical in dealing with family disputes, which are generally highly emotive and require difficult decisions to be made regarding children, material assets and ongoing relationships. In this article we draw on findings from an evaluation of the Family Relationship Centre at Broadmeadows (FRCB) to offer some insights into and suggestions about managing cultural matters in the current practice of family dispute resolution (FDR) in Australia. The brief for the original research was to evaluate the cultural appropriateness of FDR services offered to culturally and linguistically diverse (CALD) communities living within the FRCB’s catchment area, specifically members of the Lebanese, Turkish and Iraqi communities. The conclusions of the evaluations were substantially positive. The work of the Centre was found to illustrate many aspects of best practice but also raised questions worthy of future exploration. The current article reports on overall cultural appropriateness, particularly identifying barriers which may inhibit access and how acculturation may play a role in reducing perception of barriers. An earlier article reported on access, retention and outcomes for these CALD groups (Akin Ojelabi et al., 2011).
Abstract:
This paper describes a concept for a collision avoidance system for ships, which is based on model predictive control. A finite set of alternative control behaviors is generated by varying two parameters: offsets to the guidance course angle commanded to the autopilot and changes to the propulsion command, ranging from nominal speed to full reverse. Using simulated predictions of the trajectories of the obstacles and ship, compliance with the Convention on the International Regulations for Preventing Collisions at Sea and the collision hazards associated with each of the alternative control behaviors are evaluated on a finite prediction horizon, and the optimal control behavior is selected. Robustness to sensing error, predicted obstacle behavior, and environmental conditions can be ensured by evaluating multiple scenarios for each control behavior. The method is conceptually and computationally simple, and yet quite versatile, as it can account for the dynamics of the ship, the dynamics of the steering and propulsion system, forces due to wind and ocean current, and any number of obstacles. Simulations show that the method is effective and can manage complex scenarios with multiple dynamic obstacles and uncertainty associated with sensors and predictions.
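A minimal sketch of the behaviour-selection loop described above, assuming straight-line trajectory predictions and placeholder cost weights; the hazard and deviation terms below are simplified stand-ins rather than the authors' cost function.

```python
# Hypothetical sketch of the finite-behaviour MPC loop: enumerate (course offset,
# propulsion) candidates, predict trajectories, score each candidate, keep the best.
# Kinematics, weights and the safety distance are placeholders; a COLREGS-compliance
# penalty would be added to cost() in the same way as the hazard term.
import numpy as np

COURSE_OFFSETS = np.deg2rad([-90, -60, -30, -15, 0, 15, 30, 60, 90])
PROPULSION = [1.0, 0.5, 0.0, -1.0]     # fraction of nominal speed; -1 = full reverse
DT, HORIZON = 5.0, 300.0               # prediction step and horizon [s]
SAFE_DIST = 200.0                      # minimum acceptable separation [m]

def predict_own_ship(pos, heading, speed, course_offset, prop, steps):
    """Straight-line prediction (stand-in for a proper ship and steering model)."""
    vel = speed * prop * np.array([np.cos(heading + course_offset),
                                   np.sin(heading + course_offset)])
    return pos + np.outer(np.arange(1, steps + 1) * DT, vel)

def cost(own_traj, obstacle_trajs, course_offset, prop):
    c = 0.1 * abs(course_offset) + 0.5 * (1.0 - prop)   # penalise deviation from plan
    for obs in obstacle_trajs:                           # collision-hazard term
        d = np.linalg.norm(own_traj - obs, axis=1).min()
        if d < SAFE_DIST:
            c += 100.0 * (SAFE_DIST - d) / SAFE_DIST
    return c

def select_behaviour(own_pos, own_heading, own_speed, obstacle_trajs):
    steps = int(HORIZON / DT)
    candidates = [(co, pr) for co in COURSE_OFFSETS for pr in PROPULSION]
    return min(candidates, key=lambda b: cost(
        predict_own_ship(own_pos, own_heading, own_speed, b[0], b[1], steps),
        obstacle_trajs, b[0], b[1]))
```

The robustness discussed in the abstract would amount to evaluating cost() over several sampled obstacle predictions per candidate and aggregating the scores.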
Abstract:
The most difficult operation in flood inundation mapping using optical flood images is to map the ‘wet’ areas where trees and houses are partly covered by water. This is a typical instance of the mixed-pixel problem in such images. A number of automatic image classification algorithms have been developed over the years for flood mapping using optical remote sensing images, most of which label each pixel as a single class. However, they often fail to generate reliable flood inundation mapping because of the presence of mixed pixels in the images. To solve this problem, spectral unmixing methods have been developed. In this thesis, methods for selecting endmembers and for modelling the primary classes for unmixing, the two most important issues in spectral unmixing, are investigated. We conduct comparative studies of three typical spectral unmixing algorithms: Partial Constrained Linear Spectral Unmixing, Multiple Endmember Selection Mixture Analysis and spectral unmixing using the Extended Support Vector Machine method. They are analysed and assessed by error analysis in flood mapping using MODIS, Landsat and World View-2 images. The conventional Root Mean Square Error assessment is applied to obtain errors for the estimated fractions of each primary class. Moreover, a newly developed Fuzzy Error Matrix is used to obtain a clear picture of error distributions at the pixel level. This thesis shows that the Extended Support Vector Machine method is able to provide a more reliable estimation of fractional abundances and allows the use of a complete set of training samples to model a defined pure class. Furthermore, it can be applied to analysis of both pure and mixed pixels to provide integrated hard-soft classification results. Our research also identifies and explores a serious drawback in relation to endmember selection in current spectral unmixing methods, which apply fixed sets of endmember classes or pure classes for the mixture analysis of every pixel in an entire image. However, as it is not accurate to assume that every pixel in an image must contain all endmember classes, these methods usually cause an over-estimation of the fractional abundances in a particular pixel. In this thesis, a subset of adaptive endmembers in every pixel is derived using the proposed methods to form an endmember index matrix. The experimental results show that using the pixel-dependent endmembers in unmixing significantly improves performance.
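As an illustration of the constrained linear-unmixing step only (a sketch under assumed endmember spectra, not the thesis' implementation), abundances for one pixel can be obtained with non-negative least squares and a softly enforced sum-to-one constraint:

```python
# Illustrative sketch of constrained linear spectral unmixing for one pixel:
# non-negative least squares with a heavily weighted row of ones appended to the
# endmember matrix, which softly enforces sum(abundances) ~= 1. Endmember spectra
# here are random placeholders, not real image data.
import numpy as np
from scipy.optimize import nnls

def unmix_pixel(pixel, endmembers, weight=1000.0):
    """pixel: (bands,); endmembers: (bands, n_endmembers) -> abundances (n_endmembers,)."""
    E = np.vstack([endmembers, weight * np.ones(endmembers.shape[1])])
    y = np.append(pixel, weight)                 # sum-to-one constraint row
    abundances, _ = nnls(E, y)
    return abundances

rng = np.random.default_rng(0)
E = rng.random((6, 3))                           # 6 bands, 3 endmembers (e.g. water, vegetation, soil)
true_frac = np.array([0.6, 0.3, 0.1])            # a partly flooded "mixed" pixel
pixel = E @ true_frac + 0.01 * rng.standard_normal(6)
print(unmix_pixel(pixel, E).round(2))            # should be close to [0.6, 0.3, 0.1]
```

The Extended Support Vector Machine approach and the pixel-dependent endmember selection discussed in the thesis replace this fixed global endmember set.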
Abstract:
There is on-going international interest in the relationships between assessment instruments, students’ understanding of science concepts and context-based curriculum approaches. This study extends earlier research showing that students can develop connections between contexts and concepts – called fluid transitions – when studying context-based courses. We provide an in-depth investigation of one student’s experiences with multiple contextual assessment instruments that were associated with a context-based course. We analyzed the student’s responses to context-based assessment instruments to determine the extent to which contextual tests, reports of field investigations, and extended experimental investigations afforded her opportunities to make connections between contexts and concepts. A system of categorizing student responses was developed that can inform other educators when analyzing student responses to contextual assessment. We also refine the theoretical construct of fluid transitions that informed the study initially. Implications for curriculum and assessment design are provided in light of the findings.
Abstract:
The resources of health systems are limited. There is therefore a need for information on the performance of the health system for decision-making purposes. This study concerns the utilization of administrative registers in health system performance evaluation. To address this issue, a multidisciplinary methodological framework for register-based data analysis is defined. Because the fixed structure of register-based data indirectly determines constraints on the theoretical constructs, it is essential to elaborate the whole analytic process with respect to the data. The fundamental methodological concepts and theories are synthesized into a data-sensitive approach that helps to understand and overcome the problems likely to be encountered during register-based data analysis. Pragmatically useful health system performance monitoring should produce valid information about the volume of problems, the use of services and the effectiveness of the services provided. A conceptual model for hip fracture performance assessment is constructed, and the validity of Finnish registers as a data source for the performance assessment of hip fracture treatment is confirmed. Solutions to several pragmatic problems related to the development of a register-based hip fracture incidence surveillance system are proposed. The monitoring of the effectiveness of treatment is shown to be possible in terms of care episodes. Finally, an example of the justification of a more detailed performance indicator to be used in profiling providers is given. In conclusion, it is possible to produce useful and valid information on health system performance by using Finnish register-based data. However, doing so seems to be far more complicated than is typically assumed. The perspectives given in this study provide a necessary basis for further work and support the routine implementation of a hip fracture monitoring system in Finland.
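As a hypothetical illustration of the care-episode idea, register rows might be chained into episodes as in the sketch below; the column names, the one-day linkage gap and the ICD-10 S72 code prefix are assumptions made for the example, not the study's definitions.

```python
# Hypothetical sketch only: build care episodes from hospital discharge register
# rows by chaining admissions that start within a short gap of the previous
# discharge for the same person, then pick first hip-fracture episodes.
import pandas as pd

def build_episodes(df, max_gap_days=1):
    """df columns assumed: person_id, admission_date, discharge_date, diagnosis."""
    df = df.sort_values(["person_id", "admission_date"]).copy()
    gap = df["admission_date"] - df.groupby("person_id")["discharge_date"].shift()
    new_episode = gap.isna() | (gap > pd.Timedelta(days=max_gap_days))
    df["episode_id"] = new_episode.groupby(df["person_id"]).cumsum()
    return df

def incident_hip_fractures(episodes):
    """First hip-fracture hospitalisation per person (recurrence and washout rules omitted)."""
    hf = episodes[episodes["diagnosis"].str.startswith("S72")]
    return hf.groupby("person_id").first().reset_index()
```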
Abstract:
Impacts of climate change on hydrology are assessed by downscaling large-scale general circulation model (GCM) outputs of climate variables to local-scale hydrologic variables. This modelling approach is characterized by uncertainties resulting from the use of different models, different scenarios, etc. Modelling uncertainty in climate change impact assessment includes assigning weights to GCMs and scenarios based on their performance, and providing a weighted mean projection for the future. This projection is then used for water resources planning and adaptation to combat the adverse impacts of climate change. The present article summarizes the recent published work of the authors on uncertainty modelling and the development of adaptation strategies to climate change for the Mahanadi river in India.
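A minimal sketch of the weighting idea under assumed skill scores, not the authors' published scheme: each GCM-scenario combination gets a normalised weight from its performance score, and the weighted mean and spread summarise the projection.

```python
# Illustrative only: performance-weighted mean projection across GCM/scenario
# combinations. Skill scores and projected changes below are hypothetical.
import numpy as np

skill = np.array([0.9, 0.7, 0.4, 0.6])            # e.g. a performance score per GCM/scenario
projection = np.array([-12.0, -8.0, -3.0, -6.0])  # hypothetical % change in streamflow

weights = skill / skill.sum()                     # normalised weights
mean_projection = weights @ projection            # weighted mean projection
spread = np.sqrt(weights @ (projection - mean_projection) ** 2)
print(f"weighted mean change: {mean_projection:.1f}% +/- {spread:.1f}%")
```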
Abstract:
Nanotechnology is a new technology which is generating a lot of interest among academicians, practitioners and scientists. Critical research is being carried out in this area all over the world. Governments are creating policy initiatives to promote developments in nanoscale science and technology, and private investment is also showing a rising trend. A large number of academic institutions and national laboratories have set up research centers working on the multiple applications of nanotechnology. A wide range of applications is claimed for nanotechnology, spanning materials, chemicals, textiles and semiconductors through to drug delivery systems and diagnostics. Nanotechnology is considered to be the next big wave of technology after information technology and biotechnology. In fact, nanotechnology holds the promise of advances that exceed those achieved in recent decades in computers and biotechnology. Much of the interest in nanotechnology could also be due to the enormous monetary benefits expected from nanotechnology-based products. According to the NSF, revenues from nanotechnology could touch $1 trillion by 2015. However, much of these benefits are still projections. Realizing the claimed benefits requires the successful development of nanoscience and nanotechnology research efforts; that is, the journey from invention to innovation has to be completed. For this to happen, the technology has to flow from the laboratory to the market. Nanoscience and nanotechnology research efforts have to result in new products, new processes and new platforms. India has also started its Nanoscience and Nanotechnology development program under its 10th Five Year Plan, and funds worth Rs 1 billion have been allocated for Nanoscience and Nanotechnology research and development. The aim of this paper is to assess Nanoscience and Nanotechnology initiatives in India. We propose a conceptual model derived from the resource-based view of innovation, and we developed a structured questionnaire to measure the constructs in the conceptual model. Responses were collected from 115 scientists and engineers working in the field of Nanoscience and Nanotechnology, and were analyzed using Principal Component Analysis, Cluster Analysis and Regression Analysis.
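For illustration only, the following sketch shows the kind of analysis pipeline named above (principal components, clustering, regression) applied to Likert-scale questionnaire data; the data, component count and cluster count are placeholders rather than the paper's specification.

```python
# Illustrative pipeline: PCA on questionnaire items, clustering of respondents in
# component space, and regression of a stand-in outcome on the components.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.cluster import KMeans
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(1)
X = rng.integers(1, 6, size=(115, 20)).astype(float)   # 115 respondents, 20 Likert items (placeholder)
y = X[:, :5].mean(axis=1) + rng.normal(0, 0.3, 115)    # stand-in "innovation outcome" score

components = PCA(n_components=4).fit_transform(X)      # extract latent constructs
clusters = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(components)
model = LinearRegression().fit(components, y)          # constructs -> outcome
print(model.coef_.round(2), np.bincount(clusters))
```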