959 results for WORK ANALYSIS
Abstract:
A rotating beam finite element in which the interpolating shape functions are obtained by satisfying the governing static homogeneous differential equation of Euler–Bernoulli rotating beams is developed in this work. The shape functions turn out to be rational functions which also depend on the rotation speed and the element position along the beam, and account for the centrifugal stiffening effect. These rational functions reduce to the Hermite cubic when the rotation speed becomes zero. The new element is applied to the static and dynamic analysis of rotating beams. In the static case, a cantilever beam with a tip load and a radially varying axial force is considered. The new element approximates the analytical series solution for the tip deflection much more closely than the classical finite element based on Hermite cubic shape functions. In the dynamic analysis, the new element is applied to uniform and tapered rotating beams with cantilever and hinged boundary conditions to determine the natural frequencies, and the results compare very well with published results in the literature.
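For reference, the governing static homogeneous equation mentioned above takes the following standard form for a rotating Euler–Bernoulli beam (a sketch; the hub radius r and rotation speed Ω notation are assumptions, since the abstract fixes no symbols):

```latex
% Static homogeneous equation of a rotating Euler--Bernoulli beam,
% with centrifugal axial force T(x) for a beam of length L mounted
% on a hub of radius r spinning at \Omega (notation assumed here).
\[
  \frac{d^2}{dx^2}\!\left( EI \,\frac{d^2 w}{dx^2} \right)
  - \frac{d}{dx}\!\left( T(x)\,\frac{dw}{dx} \right) = 0,
  \qquad
  T(x) = \int_x^L \rho A\,\Omega^2 (r + \xi)\, d\xi .
\]
```

Setting Ω = 0 makes T(x) vanish, recovering the ordinary beam equation whose cubic solutions are the Hermite shape functions, which is why the rational shape functions degenerate to the Hermite cubic at zero speed.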
Abstract:
Cool roof coatings have a beneficial impact on reducing the heat load of a range of building types, resulting in reduced cooling energy loads. This study seeks to understand the extent to which cool roof coatings could be used as a residential demand side management (DSM) strategy for retrofitting existing housing in a constrained network area in tropical Australia, where peak electrical demand is heavily influenced by residential cooling loads. In particular, this study seeks to determine whether simulation software used for building regulation purposes can provide networks with the 'impact certainty' required by their DSM principles. The building simulation method is supported by a field experiment. Both numerical and experimental data confirm reductions in total consumption (kWh) and energy demand (kW). The nature of the regulated simulation software, combined with the diverse nature of residential buildings and their patterns of occupancy, however, means that simulated results cannot be extrapolated to quantify benefits to a broader distribution network. The study suggests that building data gained from regulatory simulations could be a useful guide to the potential impacts of widespread application of cool roof coatings in this region. The practical realization of these positive impacts, however, would require changes to the current business model for the evaluation of DSM strategies. The study provides seven key recommendations that encourage distribution networks to think beyond their infrastructure boundaries, recognising that the broader energy system also includes buildings, appliances and people.
Abstract:
Since the 2000s activewear has grown as a fashion category, and the tropes of gym wear – leggings, leotards and block colours – have become fashionable attire for both men and women outside the gym. This article examines the rise of activewear in the context of an ongoing dialogue between fashion and sport since the beginning of the twentieth century. Through an analysis of the Australian activewear label, Lorna Jane, we consider the fashionable female body as both the object and subject of a consumer culture that increasingly overlays leisure with fashion. Activewear can be seen as the embodiment of an active and fashionable lifestyle that is achieved through a regime of self-discipline, and that symbolizes the pleasure in attaining and displaying the healthy and fit body.
Abstract:
This work belongs to the field of computational high-energy physics (HEP). The key methods used in this thesis to meet the challenges raised by the Large Hadron Collider (LHC) era experiments are object orientation with software engineering, Monte Carlo simulation, cluster computing technology, and artificial neural networks. The first aspect discussed is the development of hadronic cascade models, used for the accurate simulation of medium-energy hadron-nucleus reactions up to 10 GeV. These models are typically needed in hadronic calorimeter studies and in the estimation of radiation backgrounds. Applications outside HEP include the medical field (such as hadron treatment simulations), space science (satellite shielding), and nuclear physics (spallation studies). Validation results are presented for several significant improvements released in the Geant4 simulation toolkit, and the significance of the new models for computing in the LHC era is estimated. In particular, we estimate the ability of the Bertini cascade to simulate the Compact Muon Solenoid (CMS) hadron calorimeter (HCAL). LHC test beam activity has a tightly coupled simulation-to-data-analysis cycle; typically, a Geant4 computer experiment is used to understand test beam measurements. Thus, another aspect of this thesis is a description of studies related to developing new CMS H2 test beam data analysis tools and performing data analysis on the basis of CMS Monte Carlo events. These events have been simulated in detail using Geant4 physics models, a full CMS detector description, and event reconstruction. Using the ROOT data analysis framework, we have developed an offline ANN-based approach to tag b-jets associated with heavy neutral Higgs particles, and we show that this kind of neural network methodology can be successfully used to separate the Higgs signal from the background in the CMS experiment.
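The abstract gives no detail of the network architecture or its inputs; the following is a minimal illustrative sketch of ANN-based signal/background separation in the same spirit, using scikit-learn on synthetic stand-in features rather than the thesis's ROOT-based tools and CMS Monte Carlo events:

```python
# Minimal sketch of ANN-based signal/background separation.
# (Illustrative only -- the thesis used ROOT-based tools on real CMS
#  Monte Carlo events; the features below are synthetic stand-ins.)
import numpy as np
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(0)
n = 10_000
# Hypothetical discriminating variables (e.g. impact-parameter
# significance, secondary-vertex mass): signal shifted from background.
bkg = rng.normal(loc=0.0, scale=1.0, size=(n, 3))
sig = rng.normal(loc=0.8, scale=1.0, size=(n, 3))
X = np.vstack([bkg, sig])
y = np.concatenate([np.zeros(n), np.ones(n)])

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
clf = MLPClassifier(hidden_layer_sizes=(16, 8), max_iter=500,
                    random_state=0).fit(X_tr, y_tr)
print("ROC AUC:", roc_auc_score(y_te, clf.predict_proba(X_te)[:, 1]))
```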
Abstract:
This paper addresses the challenges of flood mapping using multispectral images. Quantitative flood mapping is critical for flood damage assessment and management. Remote sensing images obtained from various satellite or airborne sensors provide valuable data for this application, from which information on the extent of the flood can be extracted. However, the great challenge in the data interpretation is to achieve more reliable flood extent mapping, including both the fully inundated areas and the 'wet' areas where trees and houses are partly covered by water. This is a typical combined pure-pixel and mixed-pixel problem. In this paper, a recently developed extended Support Vector Machine method for spectral unmixing is applied to generate an integrated map showing both pure pixels (fully inundated areas) and mixed pixels (trees and houses partly covered by water). The outputs were compared with the conventional mean-based linear spectral mixture model, and better performance was demonstrated on a subset of Landsat ETM+ data recorded at the Daly River Basin, NT, Australia, on 3 March 2008, after a flood event.
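For context, the conventional linear spectral mixture model mentioned above expresses each pixel spectrum as a non-negative, sum-to-one combination of endmember spectra. A minimal sketch of fully constrained unmixing for one pixel (synthetic spectra; this is the conventional baseline, not the paper's extended SVM method):

```python
# Fully constrained linear spectral unmixing for a single pixel:
# minimise ||E a - x||^2 subject to a >= 0 and sum(a) == 1.
# (Illustrative baseline only -- not the paper's extended SVM method.)
import numpy as np
from scipy.optimize import minimize

E = np.array([[0.10, 0.60, 0.30],   # synthetic endmember spectra:
              [0.15, 0.55, 0.40],   # columns = water, vegetation, soil
              [0.20, 0.50, 0.45],
              [0.05, 0.45, 0.50]])  # rows = spectral bands
x = E @ np.array([0.7, 0.2, 0.1])   # synthetic mixed pixel (70% water)

res = minimize(lambda a: np.sum((E @ a - x) ** 2),
               x0=np.full(3, 1 / 3),
               method="SLSQP",
               bounds=[(0, 1)] * 3,
               constraints=[{"type": "eq", "fun": lambda a: a.sum() - 1}])
print("estimated fractions:", np.round(res.x, 3))  # ~ [0.7, 0.2, 0.1]
```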
Abstract:
Analysis of the genomic sequences of Escherichia coli and Salmonella typhimurium has revealed the presence of several homologues of the well-studied citrate synthase (CS). One of these homologues has been shown to code for 2-methylcitrate synthase (2-MCS) activity. 2-MCS catalyzes one of the steps in the 2-methylcitric acid cycle, found in these organisms for the degradation of propionate to pyruvate and succinate. In the present work, the gene coding for 2-MCS from S. typhimurium (StPrpC) was cloned into the pRSET-C vector and overexpressed in E. coli. The protein was purified to homogeneity using Ni-NTA affinity chromatography and crystallized using the microbatch-under-oil method. The StPrpC crystals diffracted X-rays to 2.4 Å resolution and belonged to the triclinic space group P1, with unit-cell parameters a = 92.068, b = 118.159, c = 120.659 Å, α = 60.84, β = 67.77, γ = 81.92°. Computation of rotation functions using the X-ray diffraction data shows that the protein is likely to be a decamer of identical subunits, unlike CSs, which are dimers or hexamers.
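As a quick consistency check, the unit-cell volume follows from the reported triclinic parameters via the standard formula; a small sketch:

```python
# Volume of a triclinic unit cell from the reported parameters:
# V = abc * sqrt(1 + 2*cosA*cosB*cosG - cos^2 A - cos^2 B - cos^2 G)
import numpy as np

a, b, c = 92.068, 118.159, 120.659             # cell edges (Angstroms)
al, be, ga = np.radians([60.84, 67.77, 81.92]) # cell angles
ca, cb, cg = np.cos([al, be, ga])
V = a * b * c * np.sqrt(1 + 2 * ca * cb * cg - ca**2 - cb**2 - cg**2)
print(f"unit-cell volume ~ {V:.0f} A^3")
```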
Abstract:
- Background: Sonography is an important diagnostic tool in children with suspected appendicitis. Reported accuracy and appendiceal visualisation rates vary significantly, as does the management of equivocal ultrasound findings. The aim of this study was to audit appendiceal sonography at a tertiary children's hospital and provide baseline data for a future prospective study.
- Summary of work: Records of children who underwent ultrasound studies for possible appendicitis between January 2008 and December 2010 were reviewed. Variables included patient demographics, sonographic appendix characteristics, and secondary signs. Descriptive statistics and analyses using ANOVA, the Mann-Whitney U test, and ROC curves were performed. Mater Human Research Ethics Committee approval was granted.
- Summary of results: There were 457 eligible children. Using a dichotomous diagnostic model (including equivocal results), sensitivity was 89.6%, specificity 91.6%, and the diagnostic yield 40.7%. ROC curve analysis of a 6 mm diameter cut-off gave an AUC of 0.88 (95% CI 0.80 to 0.95).
- Discussion and conclusions: Sonography is an accurate test for acute appendicitis in children, with a high sensitivity and negative predictive value. A 6 mm diameter used as an absolute cut-off in a binary model can lead to false findings. Results were compared with the available literature. Recent publications propose categorising diameter [1] and integrating secondary signs [2] to improve accuracy and provide more meaningful results to clinicians. This study will serve as a benchmark for future studies using multiple diagnostic categories.
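For readers unfamiliar with the reported metrics, a minimal sketch of how sensitivity, specificity and ROC AUC are computed for a diameter cut-off (synthetic data, not the study's records; scikit-learn assumed):

```python
# Sensitivity, specificity and ROC AUC from diagnostic test results.
# (Synthetic data for illustration -- not the study's dataset.)
import numpy as np
from sklearn.metrics import confusion_matrix, roc_auc_score

rng = np.random.default_rng(0)
disease = rng.integers(0, 2, size=457)            # true appendicitis status
# Hypothetical appendix diameters (mm): larger when diseased.
diameter = np.where(disease == 1,
                    rng.normal(8.5, 1.5, 457),
                    rng.normal(4.5, 1.5, 457))
positive = diameter >= 6.0                        # 6 mm binary cut-off

tn, fp, fn, tp = confusion_matrix(disease, positive).ravel()
print("sensitivity:", tp / (tp + fn))
print("specificity:", tn / (tn + fp))
print("AUC:", roc_auc_score(disease, diameter))   # cut-off-free measure
```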
Abstract:
The most difficult operation in flood inundation mapping using optical flood images is to map the 'wet' areas where trees and houses are partly covered by water. This is a typical instance of the mixed-pixel problem. A number of automatic image classification algorithms have been developed over the years for flood mapping using optical remote sensing images, most of which label each pixel as a single class. However, they often fail to generate reliable flood inundation maps because of the presence of mixed pixels in the images. To solve this problem, spectral unmixing methods have been developed. In this thesis, the two most important issues in spectral unmixing are investigated: the selection of endmembers and the modelling of the primary classes for unmixing. We conduct comparative studies of three typical spectral unmixing algorithms: Partial Constrained Linear Spectral Unmixing, Multiple Endmember Selection Mixture Analysis, and spectral unmixing using the Extended Support Vector Machine method. They are analysed and assessed by error analysis in flood mapping using MODIS, Landsat and WorldView-2 images. The conventional Root Mean Square Error assessment is applied to obtain errors for the estimated fractions of each primary class. Moreover, a newly developed Fuzzy Error Matrix is used to obtain a clear picture of error distributions at the pixel level. This thesis shows that the Extended Support Vector Machine method is able to provide a more reliable estimation of fractional abundances and allows the use of a complete set of training samples to model a defined pure class. Furthermore, it can be applied to both pure and mixed pixels to provide integrated hard-soft classification results. Our research also identifies and explores a serious drawback of current spectral unmixing methods, which apply a fixed set of endmember classes or pure classes to the mixture analysis of every pixel in an entire image. Since it is not accurate to assume that every pixel in an image must contain all endmember classes, these methods usually over-estimate the fractional abundances in a particular pixel. In this thesis, a subset of adaptive endmembers is derived for every pixel using the proposed methods to form an endmember index matrix. The experimental results show that using pixel-dependent endmembers in unmixing significantly improves performance.
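The pixel-dependent endmember idea can be illustrated with a small sketch that selects, for each pixel, only the endmembers spectrally close to it before unmixing (spectral angle is used here as a stand-in criterion; the thesis's actual selection methods are not specified in this abstract):

```python
# Sketch of pixel-dependent endmember selection before unmixing.
# (Spectral angle is an illustrative criterion -- the thesis's own
#  selection methods are not described in this abstract.)
import numpy as np

def spectral_angle(x, e):
    """Angle between a pixel spectrum x and an endmember spectrum e."""
    cos = x @ e / (np.linalg.norm(x) * np.linalg.norm(e))
    return np.arccos(np.clip(cos, -1.0, 1.0))

def endmember_index_row(x, endmembers, max_angle=0.55):
    """This pixel's row of the endmember index matrix: flag which
    endmembers to keep, falling back to the closest if none qualify."""
    angles = np.array([spectral_angle(x, e) for e in endmembers])
    keep = angles <= max_angle
    if not keep.any():
        keep[np.argmin(angles)] = True
    return keep

endmembers = np.array([[0.1, 0.6, 0.3, 0.2],   # synthetic spectra
                       [0.5, 0.4, 0.5, 0.6],
                       [0.9, 0.2, 0.1, 0.3]])
pixel = 0.8 * endmembers[0] + 0.2 * endmembers[1]
print(endmember_index_row(pixel, endmembers))  # [ True  True False]
```

Unmixing then runs with only the flagged subset per pixel, avoiding the over-estimation that a fixed, image-wide endmember set produces.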
Abstract:
Aims: We develop and validate tools to estimate the residual noise covariance in Planck frequency maps, quantify signal error effects, and compare different techniques to produce low-resolution maps. Methods: We derive analytical estimates of the covariance of the residual noise contained in low-resolution maps produced using a number of map-making approaches. We test these analytical predictions using Monte Carlo simulations and assess their impact on angular power spectrum estimation. We use simulations to quantify the level of signal errors incurred in the different resolution downgrading schemes considered in this work. Results: We find excellent agreement between the optimal residual noise covariance matrices and Monte Carlo noise maps. For destriping map-makers, the extent of agreement is dictated by the knee frequency of the correlated noise component and the chosen baseline offset length. Signal striping is shown to be insignificant when properly dealt with. In map resolution downgrading, we find that a carefully selected window function is required to reduce aliasing to the sub-percent level at multipoles ℓ > 2N_side, where N_side is the HEALPix resolution parameter. We show that sufficient characterization of the residual noise is unavoidable if one is to draw reliable constraints on large-scale anisotropy. Conclusions: We have described how to compute low-resolution maps with a controlled sky signal level and a reliable estimate of the residual noise covariance. We have also presented a method to smooth the residual noise covariance matrices to describe the noise correlations in smoothed, bandwidth-limited maps.
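A minimal sketch of the downgrading step using healpy, pre-smoothing to limit aliasing before degrading resolution (the Gaussian window and all parameters here are assumptions; the paper compares window functions more carefully):

```python
# Sketch of map resolution downgrading with pre-smoothing to limit
# aliasing above ell ~ 2*Nside_out. (Gaussian window is an assumption;
# the paper evaluates the choice of window function in detail.)
import numpy as np
import healpy as hp

nside_in, nside_out = 512, 32
cl = 1.0 / (np.arange(1, 3 * nside_in) ** 2)       # toy power spectrum
m = hp.synfast(np.concatenate([[0.0], cl]), nside_in)

fwhm = np.radians(3.0)                             # smoothing scale
m_smooth = hp.smoothing(m, fwhm=fwhm)              # bandwidth-limit signal
m_low = hp.ud_grade(m_smooth, nside_out)           # degrade resolution
print(m_low.shape, hp.nside2npix(nside_out))       # (12288,) 12288
```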
Abstract:
Seepage through a sand bed affects the channel hydrodynamics, which in turn alters channel stability; thus, the effect of seepage on the hydrodynamic parameters needs to be ascertained. The present work analyses the spatially varied flow of a sand-bed channel subjected to downward seepage. Numerically calculated flow profiles affected by seepage were verified against experimental observations. The present work also analyses the friction slope, velocity and bed shear stress variations along the channel under both seepage and no-seepage conditions. It was found that flow with downward seepage has a larger friction slope and bed shear stress than flow without seepage.
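A minimal sketch of how such a flow profile can be integrated numerically, assuming the classical spatially varied flow equation for decreasing discharge with Manning friction (the paper's own formulation and parameter values are not given in the abstract):

```python
# Sketch: integrate the spatially varied flow equation for decreasing
# discharge (seepage outflow) along a rectangular channel:
#   dy/dx = (S0 - Sf - Q*(dQ/dx)/(g*A^2)) / (1 - Q^2*B/(g*A^3))
# (Classical SVF form; all channel/seepage parameters are illustrative.)
import numpy as np
from scipy.integrate import solve_ivp

g = 9.81
B = 1.0            # channel width (m)
n = 0.02           # Manning roughness
S0 = 1e-4          # bed slope
qs = -1e-4         # seepage outflow per unit length, dQ/dx (m^2/s)

def rhs(x, state):
    y, Q = state
    A = B * y
    R = A / (B + 2 * y)                       # hydraulic radius
    Sf = (n * Q) ** 2 / (A ** 2 * R ** (4 / 3))
    dydx = (S0 - Sf - Q * qs / (g * A**2)) / (1 - Q**2 * B / (g * A**3))
    return [dydx, qs]

# Initial depth 0.30 m and discharge 0.05 m^3/s at x = 0 (subcritical).
sol = solve_ivp(rhs, (0.0, 100.0), [0.30, 0.05], max_step=1.0)
print("depth at x = 100 m:", sol.y[0, -1])
```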
Abstract:
The purpose of this study is to examine how transformation is defining feminist bioethics and to determine the nature of this transformation. Behind the quest for transformation is core feminism and its political implications, namely, that women and other marginalized groups have been given unequal consideration in society and the sciences and that this situation is unacceptable and should be remedied. The goal of the dissertation is to determine how feminist bioethicists integrate the transformation into their respective fields and how they apply the potential of feminism to bioethical theories and practice. On a theoretical level, feminist bioethicists wish to reveal how current ways of knowing are based on inequality. Feminists pay special attention to communal and political contexts and to the power relations endorsed by each community. In addition, feminist bioethicists endorse relational ethics, a relational account of the self in which the interconnectedness of persons is important. On the conceptual level, feminist bioethicists work with the beliefs, concepts, and practices that give us our world. As an example, I examine how feminist bioethicists have criticized and redefined the concept of autonomy. Feminist bioethicists emphasize relational autonomy, which is based on the conviction that social relationships shape moral identities and values. On the practical level, I discuss stem cell research as a test case for feminist bioethics and its ability to employ its methodologies. Analyzing these perspectives allowed me, first, to compare non-feminist and feminist accounts of stem cell ethics and, second, to analyze feminist perspectives on this novel biotechnology. Along with offering a critical evaluation of the stem cell debate, the study shows that sustainable stem cell policies should be grounded on empirical knowledge about how donors perceive stem cell research and the donation process. The study indicates that feminist bioethics should develop the use of empirical bioethics, which takes the nature of ethics seriously: ethical decisions are provisional and open to further consideration. In addition, the study shows that there is another area of development in feminist bioethics: the understanding of (moral) agency. I argue that agency should be understood to mean that actions create desires.
Abstract:
An epicyclic gear-train system with a speed step-up of 1:10, useful for numerical control work, is presented, and the system is analysed using flowgraph techniques.
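For intuition, a worked example of one standard way an epicyclic train realizes a 1:10 step-up (a sketch; the paper's actual train layout is not given in this abstract):

```latex
% Fundamental epicyclic relation (sun s, ring r, carrier c, tooth
% counts N): with the ring fixed, driving the carrier and taking
% output at the sun gives the step-up ratio 1 + N_r/N_s, so a tooth
% ratio N_r/N_s = 9 yields 1:10.
\[
  \frac{\omega_s-\omega_c}{\omega_r-\omega_c}=-\frac{N_r}{N_s}
  \;\xrightarrow{\;\omega_r=0\;}\;
  \frac{\omega_s}{\omega_c}=1+\frac{N_r}{N_s}=1+9=10 .
\]
```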
Abstract:
There is a lack of integrative conceptual models that would help to better understand the underlying reasons for the alleged problems of MBA education. To address this challenge, we draw on the work of Pierre Bourdieu to examine MBA education as an activity with its own 'economy of exchange' and 'rules of the game.' We argue that application of Bourdieu's theoretical ideas elucidates three key issues in the debate around MBA education: the outcomes of MBA programs, the inculcation of potentially problematic values and practices through the programs, and the potential of self-regulation such as accreditation and ranking for impeding the development of MBA education. First, Bourdieu's notions of capital – intellectual, social and symbolic – shed light on the 'economy of exchange' in MBA education. Critics of MBA programs have pointed out that the value of MBA degrees lies not only in 'learning.' Bourdieu's framework allows further analysis of this issue by distinguishing between intellectual capital (learning), social capital (social networks), and symbolic capital (credentials and prestige). Second, the concept of 'habitus' suggests how values and practices are inculcated through MBA education. This process is often a 'voluntary' one in which problematic or ethically questionable ideas may come to be regarded as natural. Third, Bourdieu's reflections on the 'doxa' and its reproduction and legitimation illuminate the role of accreditation and ranking in MBA education. An analysis of such self-regulation explains in part how the system may end up impeding change.
Abstract:
This study explores the relationship between Intellectual Capital and Maintenance of Work Ability. Intellectual Capital is the central framework for analysing the increasing knowledge-intensiveness of business life. It is characteristic of Intellectual Capital that the intersection of human capital, internal structures and external structures is essential. Maintenance of Work Ability, on the other hand, has been the leading paradigm for Finnish occupational health and safety activities since the late 1980s. It is also a holistic approach that emphasises the interdependence of competence, work community, work environment and health as the key to work-related wellbeing. This thesis consists of five essays that scrutinise the focal phenomena both theoretically and empirically. The conceptual model that results from the first essay provides a general framework for the whole thesis. The case study in the second essay supports the division of intangible assets, introduced in the first essay, into generative and commercially exploitable intangibles, and further into primary and secondary dimensions of generative intangibles. Further scrutiny of the interaction of generative intangible assets in the third essay reveals that employees' wellbeing enhances their readiness to contribute to the knowledge creation process. The fourth essay shows that the MWA framework could benefit knowledge-intensive work, but this would require a different approach than has been commonly adopted in Finland. In the fifth essay, deeper analysis of the MWA framework shows that its potential results from comprehensive support of the functioning of an organisation. The general conclusion of this thesis is that organisations must take care of their employees' wellbeing in order to secure the innovativeness that is the key to surviving in today's competitive business environment.
Abstract:
Extraction of text areas from document images with complex content and layout is a challenging task. A few texture-based techniques have been proposed for extracting such text blocks, but most are computationally expensive and hence far from realizable in real time. In this work, we propose a modification to two of the existing texture-based techniques to reduce their computation, accomplished using Harris corner detection. The efficiency of the two texture-based algorithms, one based on Gabor filters and the other on the log-polar wavelet signature, is compared. Gabor-feature-based texture classification performed on the smaller set of Harris corner points is observed to deliver both accuracy and efficiency.
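A minimal sketch of the underlying idea, computing Gabor responses only at Harris corner points instead of densely over the whole image (OpenCV assumed; the file name and all parameters are illustrative, not the paper's):

```python
# Sketch: restrict Gabor texture features to Harris corner points.
# (Illustrative parameters -- not those of the proposed method.)
import cv2
import numpy as np

img = cv2.imread("document.png", cv2.IMREAD_GRAYSCALE)  # placeholder path
gray = np.float32(img)

# Harris corner response; keep the strongest responses as candidates.
harris = cv2.cornerHarris(gray, blockSize=2, ksize=3, k=0.04)
ys, xs = np.where(harris > 0.01 * harris.max())

# Bank of Gabor filters at a few orientations.
kernels = [cv2.getGaborKernel((21, 21), sigma=4.0, theta=t,
                              lambd=10.0, gamma=0.5)
           for t in np.arange(0, np.pi, np.pi / 4)]
responses = [cv2.filter2D(gray, cv2.CV_32F, k) for k in kernels]

# Feature vector per corner point: Gabor magnitudes sampled only at the
# corners, avoiding dense texture computation over every pixel.
features = np.stack([np.abs(r)[ys, xs] for r in responses], axis=1)
print(features.shape)  # (num_corners, num_orientations)
```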