Abstract:
Here, mixed convection boundary layer flow of a viscous fluid along a heated, semi-infinite vertical plate is investigated in a non-absorbing medium. The relationship between convection and thermal radiation is established via a boundary condition of the second kind on the thermally radiating vertical surface. The governing boundary layer equations are transformed into dimensionless parabolic partial differential equations with the help of appropriate transformations, and the resultant system is solved numerically by applying a straightforward finite difference method together with the Gaussian elimination technique. It is worth noting that the Prandtl number, Pr, is taken to be small (Pr << 1), which is appropriate for liquid metals. Moreover, the numerical results are presented graphically, showing the effects of the important physical parameters, namely the modified Richardson number (or mixed convection parameter), Ri*, and the surface radiation parameter, R, on the local skin friction and local Nusselt number coefficients.
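The solution strategy described above can be illustrated with a generic sketch: an implicit finite-difference step in the marching direction reduces, at each station, to a tridiagonal linear system that is solved by Gaussian elimination (the Thomas algorithm). This is only a minimal illustration of that building block under assumed coefficients and array names, not the authors' actual discretisation.

```python
import numpy as np

def thomas_solve(a, b, c, d):
    """Gaussian elimination for a tridiagonal system A x = d, where
    a is the sub-diagonal, b the main diagonal and c the super-diagonal."""
    n = len(b)
    cp, dp = np.zeros(n), np.zeros(n)
    cp[0], dp[0] = c[0] / b[0], d[0] / b[0]
    for i in range(1, n):
        denom = b[i] - a[i] * cp[i - 1]
        cp[i] = c[i] / denom
        dp[i] = (d[i] - a[i] * dp[i - 1]) / denom
    x = np.zeros(n)
    x[-1] = dp[-1]
    for i in range(n - 2, -1, -1):
        x[i] = dp[i] - cp[i] * x[i + 1]
    return x

# Hypothetical usage: one implicit marching step for a diffusion-dominated
# boundary-layer variable f on a wall-normal grid of spacing dy, step dx.
ny, dy, dx = 101, 0.05, 0.01
f_old = np.exp(-np.linspace(0.0, 5.0, ny))             # previous-station profile
r = dx / dy**2
a = np.full(ny, -r); b = np.full(ny, 1.0 + 2.0 * r); c = np.full(ny, -r)
a[0] = c[0] = a[-1] = c[-1] = 0.0; b[0] = b[-1] = 1.0  # Dirichlet end rows
f_new = thomas_solve(a, b, c, f_old)
```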
Abstract:
A numerical investigation of mixed convection in a two-dimensional incompressible laminar flow over a horizontal flat plate with a streamwise sinusoidal distribution of surface temperature has been performed for different values of the Rayleigh number, Reynolds number and frequency of the periodic temperature, at constant Prandtl number and amplitude of the periodic temperature. A finite element method adapted to rectangular non-uniform mesh elements, based on a non-linear parametric solution algorithm, has been employed as the numerical scheme. The investigated parameters are the Rayleigh number, the Reynolds number and the frequency of the periodic temperature. The effect of varying each of these parameters individually, while keeping the others constant, has been studied to observe the hydrodynamic and thermal behavior of the mixed convection flow. The fluid considered in this study is air, with a Prandtl number of 0.72. The results are obtained for Rayleigh numbers from 10^2 to 10^4, Reynolds numbers from 1 to 100 and frequencies of the periodic temperature from 1 to 5. Isotherms, streamlines, and average and local Nusselt numbers are presented to show the effect of the different values of the aforementioned parameters on fluid flow and heat transfer.
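A minimal sketch of the kind of boundary condition described above: a dimensionless surface temperature that varies sinusoidally in the streamwise direction, controlled by an amplitude and a frequency. The functional form and the parameter values are assumptions for illustration, not the exact condition used in the study.

```python
import numpy as np

def surface_temperature(x, amplitude=0.5, frequency=3.0):
    """Hypothetical dimensionless wall temperature with a streamwise
    sinusoidal variation about a uniform mean value of 1."""
    return 1.0 + amplitude * np.sin(2.0 * np.pi * frequency * x)

x = np.linspace(0.0, 1.0, 201)        # dimensionless streamwise coordinate
theta_wall = surface_temperature(x)   # periodic heating/cooling of the plate
```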
Abstract:
The unsteady boundary-layer development for thermomagnetic convection of paramagnetic fluids inside a square cavity has been considered in this study. The cavity is placed in a microgravity condition (no gravitational acceleration) and under a uniform magnetic field which acts vertically. A ramp temperature boundary condition is applied on the left vertical wall of the cavity, where the temperature initially increases with time up to a specific time and remains constant thereafter. A distinct magnetic convection boundary layer develops adjacent to the left vertical wall due to the magnetic body force generated on the paramagnetic fluid. An improved scaling analysis has been performed using a triple-layer integral method and verified by numerical simulations. The Prandtl number has been chosen greater than unity, varied over the range 5-100. Moreover, the effects of various values of the magnetic parameter and the magnetic Rayleigh number on the fluid flow and heat transfer have been shown.
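The ramp boundary condition described above can be written down directly: the wall temperature rises linearly until a ramp time and is held constant afterwards. A minimal sketch, with assumed parameter names:

```python
def ramp_wall_temperature(t, t_ramp=1.0, delta_t_max=1.0):
    """Left-wall heating: linear increase up to t_ramp, constant thereafter."""
    return delta_t_max * min(t / t_ramp, 1.0)

# Example: halfway through the ramp, and well after it
print(ramp_wall_temperature(0.5))   # 0.5
print(ramp_wall_temperature(3.0))   # 1.0
```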
Abstract:
The ability to estimate asset reliability and the probability of failure is critical to reducing maintenance costs, operational downtime, and safety hazards. Predicting the survival time and the probability of failure in future time is an indispensable requirement in prognostics and asset health management. In traditional reliability models, the lifetime of an asset is estimated using failure event data alone; however, statistically sufficient failure event data are often difficult to attain in real-life situations due to poor data management, effective preventive maintenance, and the small population of identical assets in use. Condition indicators and operating environment indicators are two types of covariate data that are normally obtained in addition to failure event and suspended data. These data contain significant information about the state and health of an asset. Condition indicators reflect the level of degradation of assets, while operating environment indicators accelerate or decelerate the lifetime of assets. When these data are available, an alternative approach to traditional reliability analysis is the modelling of condition indicators and operating environment indicators and their failure-generating mechanisms using a covariate-based hazard model. The literature review indicates that a number of covariate-based hazard models have been developed. All of these existing covariate-based hazard models were developed based on the underlying theory of the Proportional Hazard Model (PHM). However, most of these models have not attracted much attention in the field of machinery prognostics. Moreover, due to the prominence of PHM, attempts at developing alternative models have, to some extent, been stifled, although a number of alternatives to PHM have been suggested. The existing covariate-based hazard models fail to fully incorporate the three types of asset health information (failure event data (i.e. observed and/or suspended), condition data, and operating environment data) into a single model for more effective hazard and reliability predictions. In addition, current research shows that condition indicators and operating environment indicators have different characteristics and are non-homogeneous covariate data. Condition indicators act as response variables (or dependent variables), whereas operating environment indicators act as explanatory variables (or independent variables). However, these non-homogeneous covariate data were modelled in the same way for hazard prediction in the existing covariate-based hazard models. The related and yet more imperative question is how both of these indicators should be effectively modelled and integrated into a covariate-based hazard model. This work presents a new approach for addressing the aforementioned challenges. The new covariate-based hazard model, termed the Explicit Hazard Model (EHM), explicitly and effectively incorporates all three types of available asset health information into the modelling of hazard and reliability predictions, and also derives the relationship between actual asset health and both condition measurements and operating environment measurements. The theoretical development of the model and its parameter estimation method are demonstrated in this work. EHM assumes that the baseline hazard is a function of both time and condition indicators.
Condition indicators provide information about the health condition of an asset; therefore, they update and re-form the baseline hazard of EHM according to the health state of the asset at a given time t. Some examples of condition indicators are the vibration of rotating machinery, the level of metal particles in engine oil analysis, and wear in a component, to name but a few. Operating environment indicators in this model are failure accelerators and/or decelerators that are included in the covariate function of EHM and may increase or decrease the value of the hazard relative to the baseline hazard. These indicators are caused by the environment in which an asset operates and have not been explicitly captured by the condition indicators (e.g. loads, environmental stresses, and other dynamically changing environmental factors). While the effects of operating environment indicators could be nought in EHM, condition indicators are always present, because these indicators are observed and measured for as long as an asset is operational and surviving. EHM has several advantages over the existing covariate-based hazard models. One is that this model utilises three different sources of asset health data (i.e. population characteristics, condition indicators, and operating environment indicators) to effectively predict hazard and reliability. Another is that EHM explicitly investigates the relationship between condition and operating environment indicators associated with the hazard of an asset. Furthermore, the proportionality assumption, from which most covariate-based hazard models suffer, does not exist in EHM. Depending on the sample size of failure/suspension times, EHM is developed into two forms: semi-parametric and non-parametric. The semi-parametric EHM assumes a specified lifetime distribution (i.e. the Weibull distribution) in the form of the baseline hazard. However, in many industry applications, due to sparse failure event data, the analysis of such data often involves complex distributional shapes about which little is known. Therefore, to avoid the semi-parametric EHM's restrictive assumption of a specified lifetime distribution for failure event histories, the non-parametric EHM, which is a distribution-free model, has been developed. The development of EHM into two forms is another merit of the model. A case study was conducted using laboratory experiment data to validate the practicality of both the semi-parametric and non-parametric EHMs. The performance of the newly developed models is appraised by comparing their estimated results with those of the existing covariate-based hazard models. The comparison demonstrated that both the semi-parametric and non-parametric EHMs outperform the existing covariate-based hazard models. Future research directions are also identified, regarding the new parameter estimation method in the case of time-dependent covariate effects and missing data, the application of EHM to both repairable and non-repairable systems using field data, and a decision support model linked to the estimated reliability results.
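To make the structure of such a covariate-based hazard model concrete, the sketch below combines a Weibull-type baseline hazard modulated by a condition indicator with a multiplicative term for an operating environment indicator, and integrates the hazard to obtain reliability. This is a schematic illustration under assumed functional forms and parameter names, not the actual EHM formulation or its estimation procedure.

```python
import numpy as np

def baseline_hazard(t, condition, beta=1.5, eta=100.0, alpha=0.05):
    """Illustrative baseline: Weibull in time, scaled up as the condition
    indicator (e.g. a vibration level) worsens. All parameters are assumed."""
    return (beta / eta) * (t / eta) ** (beta - 1.0) * np.exp(alpha * condition)

def hazard(t, condition, environment, gamma=0.3):
    """Hazard = baseline(time, condition) * exp(gamma * environment), where the
    operating environment indicator acts as a failure accelerator/decelerator."""
    return baseline_hazard(t, condition) * np.exp(gamma * environment)

# Reliability from the cumulative hazard on a time grid (trapezoidal rule).
t = np.linspace(0.0, 200.0, 401)
condition = 0.03 * t                      # hypothetical degradation trend
environment = np.full_like(t, 0.5)        # hypothetical constant load level
h = hazard(t, condition, environment)
H = np.concatenate(([0.0], np.cumsum(0.5 * (h[1:] + h[:-1]) * np.diff(t))))
reliability = np.exp(-H)                  # R(t) = exp(-integral of hazard)
```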
Abstract:
In the present work, a numerical study is performed of the confined flow of power-law non-Newtonian fluids over a rotating cylinder. The main purpose is to evaluate the drag and thermal coefficients as functions of the related governing dimensionless parameters, namely the power-law index (0.5 ≤ n ≤ 1.4), the dimensionless rotational velocity (0 ≤ α ≤ 6) and the Reynolds number (100 ≤ Re ≤ 500). Over this range of Reynolds numbers, the flow is known to be steady. The results show that increasing the power-law index and the rotational velocity increases the drag coefficient due to enhanced momentum diffusivity, which in turn lowers the rate of heat transfer: the thicker the boundary layer, the lower the heat transfer.
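For readers unfamiliar with the power-law (Ostwald-de Waele) description of non-Newtonian behaviour referred to above, the apparent viscosity varies with the shear rate as sketched below; the consistency index value is a placeholder chosen for illustration.

```python
def apparent_viscosity(shear_rate, m=1.0, n=0.8):
    """Power-law (Ostwald-de Waele) model: eta = m * |shear_rate|**(n - 1).
    n < 1: shear-thinning, n = 1: Newtonian, n > 1: shear-thickening."""
    return m * abs(shear_rate) ** (n - 1.0)

for n in (0.5, 1.0, 1.4):                 # the range of indices studied
    print(n, apparent_viscosity(10.0, n=n))
```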
Abstract:
The effect of combined conduction-convection-radiation on the natural convection flow of a Newtonian, optically thick gray fluid confined in a non-Darcian porous-medium square cavity is numerically studied. For the gray fluid, the Rosseland diffusion approximation is adopted. It is further assumed that (i) the temperature of the left vertical wall varies linearly with height, (ii) the right vertical and top walls are cooled, and (iii) the bottom wall is uniformly heated. The governing equations are solved using the Alternating Direction Implicit method together with the Successive Over Relaxation technique. The effects of the governing parameters, namely the Forchheimer resistance (Γ), the Planck number (Rd), and the temperature difference (Δ), on the flow pattern and heat transfer characteristics have been investigated. It is observed that both the flow and the heat transfer are reduced as the Forchheimer resistance is increased. On the other hand, both the strength of the flow and the heat transfer increase as the temperature ratio, Δ, is increased.
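As an illustration of one piece of the numerical machinery named above, the sketch below shows a plain Successive Over Relaxation sweep for a 2-D Poisson-type equation (such as the stream-function equation in a vorticity-stream-function formulation). It is a generic textbook implementation under assumed boundary conditions, not the specific ADI/SOR scheme of the study.

```python
import numpy as np

def sor_poisson(source, h, relax=1.5, tol=1e-6, max_iter=20000):
    """SOR iteration for del^2 psi = -source on a uniform grid of spacing h,
    with psi = 0 on all boundaries (an assumed, illustrative condition)."""
    psi = np.zeros_like(source)
    for _ in range(max_iter):
        max_change = 0.0
        for i in range(1, psi.shape[0] - 1):
            for j in range(1, psi.shape[1] - 1):
                gauss_seidel = 0.25 * (psi[i + 1, j] + psi[i - 1, j]
                                       + psi[i, j + 1] + psi[i, j - 1]
                                       + h * h * source[i, j])
                change = relax * (gauss_seidel - psi[i, j])
                psi[i, j] += change
                max_change = max(max_change, abs(change))
        if max_change < tol:
            break
    return psi

# Example: a uniform vorticity-like source on a small grid.
psi = sor_poisson(np.ones((21, 21)), h=0.05)
```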
Abstract:
Education in the 21st century demands a model for understanding a new culture of learning in the face of rapid change, open access data and geographical diversity. Teachers no longer need to provide the latest information because students themselves are taking an active role in peer collectives to help create it. This paper examines, through an Australian case study entitled ‘Design Minds’, the development of an online design education platform as a key initiative to enact a government priority for state-wide cultural change through design-based curriculum. Utilising digital technology to create a supportive community, ‘Design Minds’ recognises that interdisciplinary learning fostered through engagement will empower future citizens to think, innovate, and discover. This paper details the participatory design process undertaken with multiple stakeholders to create the platform. It also outlines a proposed research agenda for future measurement of its value in creating a new learning culture, supporting regional and remote communities, and revitalising frontline services. It is anticipated this research will inform ongoing development of the online platform, and future design education and research programs in K-12 schools in Australia.
Abstract:
Background: Ultraviolet radiation exposure during an individual's lifetime is a known risk factor for the development of skin cancer. However, less evidence is available on the relationship between lifetime sun exposure and skin damage and skin aging. Objectives: This study aims to assess the relationship between lifetime sun exposure and skin damage and skin aging using a non-invasive measure of exposure. Methods: We recruited 180 participants (73 males, 107 females) aged 18-83 years. Skin hyper-pigmentation (skin damage) and skin wrinkling (skin aging) in the facial region were measured by digital imaging. Lifetime sun exposure (expressed in hours) was calculated from the participants' age multiplied by the estimated annual time outdoors for each year of life. We analyzed the effects of lifetime sun exposure on skin damage and skin aging, adjusting for the influence of age, sex, occupation, history of skin cancer, eye color, hair color, and skin color. Results: There were non-linear relationships between lifetime sun exposure and skin damage and skin aging. Younger participants' skin was much more sensitive to sun exposure than that of participants over 50 years of age; accordingly, there were negative interactions between lifetime sun exposure and age. Age had linear effects on skin damage and skin aging. Conclusion: The data presented showed that self-reported lifetime sun exposure was positively associated with skin damage and skin aging, particularly in younger people. Future health promotion regarding sun exposure needs to pay attention to this group for skin cancer prevention messaging. (C) 2012 Elsevier B.V. All rights reserved.
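The exposure metric described above amounts to accumulating the estimated annual hours outdoors over each year of life; with a constant annual estimate this reduces to age multiplied by annual hours. A minimal sketch of that calculation, with illustrative numbers:

```python
def lifetime_sun_exposure(annual_hours_by_year):
    """Total lifetime sun exposure in hours: sum of the estimated time
    outdoors for each year of life."""
    return sum(annual_hours_by_year)

# e.g. a 40-year-old reporting roughly 500 outdoor hours in every year of life
print(lifetime_sun_exposure([500] * 40))   # 20000 hours
```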
Abstract:
This paper reports on a four-year Australian Research Council funded Linkage Project titled Skilling Indigenous Queensland, conducted in regional areas of Queensland, Australia from 2009 to 2013. The project sought to investigate vocational education and training (VET) and teaching, Indigenous learners' needs, employer culture and expectations, and community culture and expectations, to identify best practice in numeracy teaching for Indigenous VET learners. Specifically, it focused on ways to enhance the teaching and learning of courses, and the mathematics associated with such courses, to benefit learners and increase their future opportunities for employment. To date, thirty-nine teachers/trainers/teacher aides and two hundred and thirty-one students have consented to participate in the project. Nine VET courses were nominated to be the focus of the study. This paper focuses on questionnaire and interview responses from four trainers, two teacher aides and six students. In recent years a considerable amount of funding has been allocated to increasing Indigenous Peoples' participation in education and employment. This increased funding is predicated on the assumption that it will make a difference and contribute to closing the education gap between Indigenous and non-Indigenous Australians (Council of Australian Governments, 2009). The central tenet is that access to education for Indigenous People will create substantial social and economic benefits for regional and remote Indigenous People. The project's aim is to address some of the issues associated with the gap. To achieve this aim, the project adopted a mixed methods design intended to benefit research participants, which included participatory collaborative action research (Kemmis & McTaggart, 1988) and community research (Smith, 1999). Participatory collaborative action research is a "collective, self-reflective enquiry undertaken by participants in social situations in order to improve the rationality and justice of their own social and educational practices" (Kemmis et al., 1988, p. 5). Community research is described as an approach that "conveys a much more intimate, human and self-defined space" (p. 127). Community research relies on and validates the community's own definitions. As the project is informed by the social at a community level, it is described as "community action research or emancipatory research" (Smith, 1999, p. 127). It seeks to demonstrate benefit to the community, making positive differences in the lives of Indigenous People and communities. The data collection techniques included survey questionnaires, video recording of teaching and learning processes, teacher reflective video analysis of teaching, observations, semi-structured interviews and student numeracy testing. As a result of these processes, the findings indicate that VET course teachers work hard to adopt contextualising strategies in their teaching; however, this process is not always straightforward because of perceptions of how mathematics has historically been taught and learned. Further, teachers, trainers and students have high expectations of one another, with a view to successful outcomes from the courses.
Abstract:
The international climate change regime has the potential to increase the revenue available for forest restoration projects in Commonwealth nations. There are three mechanisms which could be used to fund forest projects aimed at forest conservation, forest restoration and sustainable forest management. The first forest funding opportunity arises under the clean development mechanism, a flexibility mechanism of the Kyoto Protocol. The clean development mechanism allows Annex I parties (industrialised nations) to invest in emission reduction activities in non-Annex I parties (developing countries), and the establishment of forest sinks is an eligible clean development mechanism activity. Secondly, parties to the Kyoto Protocol are able to include sustainable forest management activities in their national carbon accounting; the international rules concerning this are called the Land-Use, Land-Use Change and Forestry Guidelines. Thirdly, it is anticipated that at the upcoming Copenhagen negotiations a Reduced Emissions from Deforestation and Degradation (REDD) instrument will be created. This will provide a direct funding mechanism for those developing countries with tropical forests. Payments made under a REDD arrangement will be based upon the developing country with tropical forest cover agreeing to protect and conserve a designated forest estate. These three funding options available under the international climate change regime demonstrate that there is potential for forest finance within the regime. These opportunities are, however, hindered by a number of technical and policy barriers which limit the regime's ability to significantly increase funding for forest projects. There are two types of carbon markets: compliance carbon markets (Kyoto-based) and voluntary carbon markets. Voluntary carbon markets are more flexible than compliance markets and, as such, offer potential to increase the revenue available for sustainable forest projects.
Abstract:
Average speed enforcement is a relatively new approach gaining popularity throughout Europe and Australia. This paper reviews the evidence regarding the impact of this approach on vehicle speeds, crash rates and a number of additional road safety and public health outcomes. The economic and practical viability of the approach as a road safety countermeasure is also explored. A literature review, with an international scope, of both published and grey literature was conducted. There is a growing body of evidence to suggest a number of road safety benefits associated with average speed enforcement, including high rates of compliance with speed limits, reductions in average and 85th percentile speeds, and reduced speed variability between vehicles. Moreover, the approach has been demonstrated to be particularly effective in reducing excessive speeding behaviour. Reductions in crash rates have also been reported in association with average speed enforcement, particularly in relation to fatal and serious injury crashes. In addition, the approach has been shown to improve traffic flow and reduce vehicle emissions, and has also been associated with high levels of public acceptance. Average speed enforcement offers a more network-wide approach to managing speeds that reduces the time and distance halo effects associated with other automated speed enforcement approaches. Although comparatively expensive, it represents a highly reliable approach to speed enforcement that produces considerable returns on investment through reduced social and economic costs associated with crashes.
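The underlying measurement is simple: a vehicle's average speed over the enforced section is the distance between camera sites divided by the travel time between them. A minimal sketch of that check, with hypothetical parameter names:

```python
def exceeds_average_speed_limit(distance_km, elapsed_hours, limit_kmh,
                                tolerance_kmh=0.0):
    """Point-to-point enforcement: flag a vehicle whose section-average speed
    exceeds the posted limit (plus any enforcement tolerance)."""
    average_speed_kmh = distance_km / elapsed_hours
    return average_speed_kmh > limit_kmh + tolerance_kmh

# A vehicle covering a 10 km section in 5 minutes averages 120 km/h.
print(exceeds_average_speed_limit(10.0, 5.0 / 60.0, limit_kmh=100.0))  # True
```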
Abstract:
Internet services are an important part of daily activities for most of us. These services come with sophisticated authentication requirements which may not be handled by average Internet users. The management of secure passwords, for example, creates extra overhead which is often neglected for usability reasons. Furthermore, password-based approaches are applicable only for initial logins and do not protect against unlocked-workstation attacks. In this paper, we provide a non-intrusive identity verification scheme based on behavioral biometrics, where keystroke dynamics based on free text are used continuously to verify the identity of a user in real time. We improve existing keystroke-dynamics-based verification schemes in four aspects. First, we improve scalability by using a constant number of users, instead of the whole user space, to verify the identity of the target user. Second, we provide an adaptive user model which enables our solution to take changes in user behavior into consideration in the verification decision. Third, we identify a new distance measure which enables us to verify the identity of a user with shorter text. Fourth, we decrease the number of false results. Our solution is evaluated on a data set which we collected from users while they were interacting with their mailboxes during their daily activities.
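As background to the distance-based verification discussed above, the sketch below shows one generic way to compare a test sample's keystroke timings against a stored user profile: a standard-deviation-scaled mean absolute deviation over digraph latencies. This is not the new distance measure proposed in the paper, only an illustrative baseline with assumed array names and an assumed threshold.

```python
import numpy as np

def scaled_deviation(sample_latencies, profile_mean, profile_std):
    """Mean absolute deviation of the sample's digraph latencies from the
    user's profile means, scaled by the profile standard deviations; smaller
    values indicate a closer match to the claimed identity."""
    safe_std = np.where(profile_std > 0, profile_std, 1e-9)
    return float(np.mean(np.abs(sample_latencies - profile_mean) / safe_std))

# Hypothetical profile over three digraphs (latencies in milliseconds).
profile_mean = np.array([120.0, 95.0, 150.0])
profile_std = np.array([15.0, 10.0, 20.0])
sample = np.array([130.0, 90.0, 160.0])
accept = scaled_deviation(sample, profile_mean, profile_std) < 1.0  # assumed threshold
```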
Abstract:
There are different ways to authenticate humans, which is an essential prerequisite for access control. The authentication process can be subdivided into three categories that rely on something someone (i) knows (e.g. a password), and/or (ii) has (e.g. a smart card), and/or (iii) is (biometric features). Besides classical attacks on password solutions and the risk that identity-related objects can be stolen, traditional biometric solutions have their own disadvantages, such as the requirement for expensive devices, the risk of stolen bio-templates, etc. Moreover, in existing approaches the authentication process is usually performed only once, at the initial login. Non-intrusive and continuous monitoring of user activities emerges as a promising solution for hardening the authentication process, adding a further category: (iii-2) how someone behaves. In recent years, various keystroke-dynamics behavior-based approaches have been published that are able to authenticate humans based on their typing behavior. The majority focus on so-called static text approaches, where users are requested to type a previously defined text. Relatively few techniques are based on free text approaches that allow transparent monitoring of user activities and provide continuous verification. Unfortunately, only a few solutions are deployable in application environments under realistic conditions. Unsolved problems include, for instance, scalability, high response times and high error rates. The aim of this work is the development of behavior-based verification solutions. Our main requirement is to deploy these solutions under realistic conditions within existing environments, in order to enable transparent, free-text-based continuous verification of active users with low error rates and response times.
Abstract:
This paper examines the relationship between financial performance and ethical screening intensity of a special class of ethical funds that is rooted in Islamic values: Islamic equity funds (IEFs). These faith-based ethical funds screen investments for compliance with Islamic values, under which conventional interest expense (riba), gambling (maysir), excessive uncertainty (gharar), and non-ethical (non-halal) products are prohibited. We test whether these extra screens affect the financial performance of IEFs relative to non-Islamic funds. Based on a large survivorship-free international sample of 387 Islamic funds, our results show that IEFs on average underperform conventional funds by 40 basis points per month, or 4.8% per year (supporting the underperformance hypothesis). While Islamic funds do not generally perform better during crisis periods, they outperformed conventional funds during the recent sub-prime crisis (supporting the outperformance hypothesis). Using holdings-based measures of ethical screening intensity, the results show that IEFs applying more intensive screening perform worse, suggesting that there is a cost to being ethical.
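For clarity, the yearly figure quoted above is the simple (non-compounded) scaling of the monthly underperformance:

\[
0.40\%\ \text{per month} \times 12\ \text{months} = 4.8\%\ \text{per year}.
\]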
Abstract:
Past work has clearly demonstrated that numerous commonly used metallic materials will support burning in oxygen, especially at higher pressures. An approach to rectify this significant safety problem has been successfully developed and implemented by applying the concept of Situational Non-Flammability. This approach essentially removes, or breaks, one leg of the conceptual fire triangle, a tool commonly used to define the three things required to support burning: a fuel, an ignition source and an oxidiser. Since an oxidiser is always present in an oxygen system, as are ignition sources, the concept of Situational Non-Flammability essentially removes the fuel leg of the fire triangle by utilising only materials that will not burn at the maximum pressure at which, for example, a control valve is to be used. The utilisation of this approach has led to the development of a range of oxygen components that are practically unable to burn while in service at their design pressure, thus providing an unparalleled level of fire safety while not compromising the performance or endurance required in the function of these components. This paper describes the concept of Situational Non-Flammability, how it was used to theoretically evaluate designs of components for oxygen service, and the outcomes of the actual development, fabrication and, finally, utilisation of these components in real oxygen systems in a range of flow control devices.
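The selection rule embodied in Situational Non-Flammability can be sketched as a simple filter: keep only those candidate materials whose documented non-burn (threshold) pressure in oxygen exceeds the component's maximum service pressure. The field names and figures below are hypothetical placeholders, not data from the paper.

```python
def situationally_nonflammable(candidates, max_service_pressure_mpa):
    """Remove the 'fuel' leg of the fire triangle: keep only materials that
    will not support burning at or below the maximum service pressure."""
    return [m for m in candidates
            if m["burn_threshold_mpa"] > max_service_pressure_mpa]

# Hypothetical candidate list for a valve rated to 20 MPa oxygen service.
candidates = [
    {"name": "alloy A", "burn_threshold_mpa": 35.0},
    {"name": "alloy B", "burn_threshold_mpa": 10.0},
]
print(situationally_nonflammable(candidates, max_service_pressure_mpa=20.0))
```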