927 results for seed retention time


Relevance:

20.00%

Publisher:

Abstract:

The broad objective of the study was to better understand anxiety among adolescents in Kolkata city, India. Specifically, the study compared anxiety across gender, school type, socio-economic background and mothers’ employment status. The study also examined adolescents’ perceptions of quality time with their parents. A group of 460 adolescents (220 boys and 240 girls) aged 13-17 years was recruited to participate in the study via a multi-stage sampling technique. The data were collected using a self-report semi-structured questionnaire and a standardized psychological test, the State-Trait Anxiety Inventory. Results show that anxiety was prevalent in the sample, with 20.1% of boys and 17.9% of girls found to be suffering from high anxiety. More boys were anxious than girls (p<0.01). Adolescents from Bengali-medium schools were more anxious than adolescents from English-medium schools (p<0.01). Adolescents belonging to the middle class (middle socio-economic group) suffered more anxiety than those from both high and low socio-economic groups (p<0.01). Adolescents with working mothers were found to be more anxious (p<0.01). Results also show that a substantial proportion of the adolescents perceived that they did not receive quality time from their fathers (32.1%) and mothers (21.3%). A large number also did not feel comfortable sharing their personal issues with their parents (60.0% for fathers and 40.0% for mothers).

Relevance:

20.00%

Publisher:

Abstract:

The Lockyer Valley, southeast Queensland, hosts intensive irrigated agriculture using groundwater drawn from over 5000 alluvial bores. A current project is considering the introduction of purified recycled water (PRW) to augment groundwater supplies. To assess this, a valley-wide MODFLOW simulation model and a new unsaturated zone flow model are being developed. To underpin these models and provide a realistic understanding of the aquifer framework, a 3D visualisation model has been developed using the Groundwater Visualisation System (GVS) software produced at QUT.
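As a rough sketch of what setting up such a groundwater-flow model involves (not the project's actual configuration), the snippet below builds a minimal single-layer MODFLOW model with the open-source flopy package; the grid dimensions, hydraulic properties, recharge rate and pumping bore are all hypothetical placeholder values.

# Minimal sketch of a single-layer MODFLOW model built with flopy.
# All grid dimensions and parameter values are illustrative placeholders,
# not the Lockyer Valley project's actual configuration.
import numpy as np
import flopy

mf = flopy.modflow.Modflow(modelname="alluvium_demo", exe_name="mf2005")

# Discretisation: one layer, 50 x 80 cells, 250 m spacing (hypothetical)
dis = flopy.modflow.ModflowDis(mf, nlay=1, nrow=50, ncol=80,
                               delr=250.0, delc=250.0, top=100.0, botm=60.0)

# Basic package: all cells active, starting heads at the top of the layer
ibound = np.ones((1, 50, 80), dtype=int)
bas = flopy.modflow.ModflowBas(mf, ibound=ibound, strt=100.0)

# Layer properties: uniform hydraulic conductivity (placeholder values)
lpf = flopy.modflow.ModflowLpf(mf, hk=10.0, vka=1.0)

# Diffuse recharge and a single pumping bore (layer, row, col, rate in m3/d)
rch = flopy.modflow.ModflowRch(mf, rech=1e-4)
wel = flopy.modflow.ModflowWel(mf, stress_period_data={0: [[0, 25, 40, -500.0]]})

# Solver and output control, then write the MODFLOW input files
pcg = flopy.modflow.ModflowPcg(mf)
oc = flopy.modflow.ModflowOc(mf)
mf.write_input()
# mf.run_model()  # requires an mf2005 executable on the path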

Relevance:

20.00%

Publisher:

Abstract:

Art is most often at the margins of community life, seen only as a distraction or entertainment, an individual’s whim, and generally perceived as having no useful role to play in that community. This is a perception of grown-ups; children seem to readily accept an engagement with art making. Our research has shown that when individuals are drawn into a crafted art project in which they have an actual involvement with the direction and production of the artwork, they become deeply engaged on multiple levels. This is true of all age groups. Artists skilled in community collaboration are able to produce art of value that transcends the usual judgements of worth. It gives people a licence to unfetter their imagination and then to be drawn back, cooperatively, to a reachable visual solution. If you engage with children in a community, you engage the extended family at some point. The primary methodology was to produce a series of educationally valid projects at the Cherbourg State School that had a resonance in that community, then to revisit and refine them where necessary and develop a new series that extended all of their positive aspects. This was done over a period of five years. The art made during this time is excellent. The children know it, as do their families, staff at the school, members of the local community and the others who have viewed it in exhibitions in places as far away as Brisbane and Melbourne. This art, and the way it has been made, has been acknowledged as useful by the children, teachers and the community in educational and social terms. The school is a better place to be, and this too has been acknowledged by the children, teachers and the community. The art making of the last five years has become an integral part of the way the school now operates, and its influence has begun to seep into other parts of the community. Art needs to be taken from the margins and put to work at the centre.

Relevance:

20.00%

Publisher:

Abstract:

The high morbidity and mortality associated with atherosclerotic coronary vascular disease (CVD) and its complications are being lessened by increased knowledge of risk factors, effective preventative measures and proven therapeutic interventions. However, significant CVD morbidity remains, and sudden cardiac death continues to be a presenting feature for some patients subsequently diagnosed with CVD. Coronary vascular disease is also the leading cause of anaesthesia-related complications. Stress electrocardiography/exercise testing is predictive of 10-year risk of CVD events, and the cardiovascular variables used to score this test are monitored peri-operatively. Similar physiological time-series datasets are being subjected to data mining methods for the prediction of medical diagnoses and outcomes. This study aims to find predictors of CVD using anaesthesia time-series data and patient risk factor data.

Several pre-processing and predictive data mining methods are applied to these data. Physiological time-series data related to anaesthetic procedures are subjected to pre-processing methods for removal of outliers and calculation of moving averages, as well as data summarisation and data abstraction methods. Feature selection methods of both wrapper and filter types are applied to derived physiological time-series variable sets alone and to the same variables combined with risk factor variables, and the ability of these methods to identify subsets of highly correlated but non-redundant variables is assessed. The major dataset is derived from the entire anaesthesia population; subsets of this population are considered to be at increased anaesthesia risk based on their need for more intensive monitoring (invasive haemodynamic monitoring and additional ECG leads). Because of the unbalanced class distribution in the data, majority-class under-sampling is applied, and the Kappa statistic, together with misclassification rate and area under the ROC curve (AUC), is used to evaluate models generated using different prediction algorithms.

The performance of models derived from feature-reduced datasets reveals the filter method, Cfs subset evaluation, to be most consistently effective, although Consistency-derived subsets tended to slightly increase accuracy at the cost of markedly increased complexity. The use of misclassification rate (MR) for model performance evaluation is influenced by class distribution; this influence can be eliminated by considering the AUC or Kappa statistic, as well as by evaluating subsets with an under-sampled majority class. The noise and outlier removal pre-processing methods produced models with MR ranging from 10.69 to 12.62, the lowest value being for data from which both outliers and noise were removed (MR 10.69); for the raw time-series dataset, MR is 12.34. Feature selection reduces MR to between 9.8 and 10.16, with time-segmented summary data (dataset F) at 9.8 and raw time-series summary data (dataset A) at 9.92. However, for all datasets based on time-series data alone, model complexity is high. For most pre-processing methods, Cfs could identify a subset of correlated and non-redundant variables from the time-series-only datasets, but models derived from these subsets consist of a single leaf only, and MR values are consistent with the class distribution in the subset folds evaluated in the n-fold cross-validation method.
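As a rough illustration of the evaluation approach described above, the sketch below performs majority-class under-sampling and then reports misclassification rate, Kappa and AUC. It is a minimal stand-in, not the thesis's code: the thesis used Weka algorithms (J48, SMO, NNge, logistic regression) on anaesthesia data, whereas this sketch uses a synthetic imbalanced dataset and a scikit-learn logistic regression, and all parameter values are arbitrary.

# Minimal sketch: majority-class under-sampling and evaluation with
# misclassification rate, Kappa and AUC.  Synthetic imbalanced data and a
# scikit-learn model stand in for the thesis's anaesthesia data and Weka tools.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import accuracy_score, cohen_kappa_score, roc_auc_score
from sklearn.model_selection import train_test_split

# Synthetic stand-in dataset: roughly 10% positive (CVD) class
X, y = make_classification(n_samples=2000, n_features=10, weights=[0.9, 0.1],
                           random_state=0)

rng = np.random.default_rng(0)

def undersample_majority(X, y):
    """Randomly discard majority-class rows until both classes are equal in size."""
    minority = np.flatnonzero(y == 1)
    majority = np.flatnonzero(y == 0)
    keep = rng.choice(majority, size=minority.size, replace=False)
    idx = np.concatenate([minority, keep])
    return X[idx], y[idx]

X_bal, y_bal = undersample_majority(X, y)
X_tr, X_te, y_tr, y_te = train_test_split(X_bal, y_bal, test_size=0.3,
                                          random_state=0, stratify=y_bal)

model = LogisticRegression(max_iter=1000).fit(X_tr, y_tr)
pred = model.predict(X_te)
prob = model.predict_proba(X_te)[:, 1]

print("misclassification rate:", round(1.0 - accuracy_score(y_te, pred), 3))
print("Kappa:", round(cohen_kappa_score(y_te, pred), 3))
print("AUC:", round(roc_auc_score(y_te, prob), 3))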
For models based on Cfs-selected time-series-derived and risk factor (RF) variables, MR ranges from 8.83 to 10.36, with dataset RF_A (raw time-series data and RF) at 8.85 and dataset RF_F (time-segmented time-series variables and RF) at 9.09. The models based on counts of outliers and counts of data points outside the normal range (dataset RF_E), and on variables derived from time series transformed using Symbolic Aggregate Approximation (SAX) with associated time-series pattern cluster membership (dataset RF_G), perform least well, with MR of 10.25 and 10.36 respectively. For coronary vascular disease prediction, nearest neighbour (NNge) and the support vector machine based method, SMO, have the highest MR, 10.1 and 10.28, while logistic regression (LR) and the decision tree (DT) method, J48, have MR of 8.85 and 9.0 respectively. DT rules are the most comprehensible and clinically relevant.

The increase in predictive accuracy achieved by adding risk factor variables to models based on time-series variables is significant, while adding time-series-derived variables to models based on risk factor variables alone is associated with a trend towards improved performance. Data mining of feature-reduced anaesthesia time-series variables together with risk factor variables can produce compact and moderately accurate models able to predict coronary vascular disease. Decision tree analysis of time-series data combined with risk factor variables yields rules which are more accurate than models based on time-series data alone. The limited additional value provided by electrocardiographic variables compared with risk factors alone is consistent with recent suggestions that exercise electrocardiography (exECG) under standardised conditions has limited additional diagnostic value over risk factor analysis and symptom pattern. The pre-processing used in this study had limited effect when time-series variables and risk factor variables are used as model input. In the absence of risk factor input, the use of time-series variables after outlier removal, and of time-series variables based on physiological values falling outside the accepted normal range, is associated with some improvement in model performance.
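The SAX transformation used for dataset RF_G can be sketched generically as follows; this is a standard textbook SAX pipeline (z-normalisation, piecewise aggregate approximation, symbol mapping via equiprobable Gaussian breakpoints), not the thesis's implementation, and the segment count, alphabet size and example heart-rate-like series are arbitrary.

# Generic SAX (Symbolic Aggregate Approximation) sketch for a 1-D time series.
# Segment count, alphabet size and the example series are illustrative only.
import numpy as np
from scipy.stats import norm

def sax_transform(series, n_segments=8, alphabet_size=4):
    x = np.asarray(series, dtype=float)
    # 1. z-normalise so that Gaussian breakpoints are applicable
    x = (x - x.mean()) / (x.std() + 1e-12)
    # 2. Piecewise Aggregate Approximation: mean of each (near-)equal segment
    paa = np.array([seg.mean() for seg in np.array_split(x, n_segments)])
    # 3. Map each PAA value to a letter via equiprobable Gaussian breakpoints
    breakpoints = norm.ppf(np.linspace(0, 1, alphabet_size + 1)[1:-1])
    return "".join(chr(ord("a") + s) for s in np.searchsorted(breakpoints, paa))

# Example: an arbitrary noisy trace reduced to a short symbolic word; such
# words could then be clustered to give pattern-cluster membership features.
rng = np.random.default_rng(0)
hr = 70 + 5 * np.sin(np.linspace(0, 6, 120)) + rng.normal(0, 1, 120)
print(sax_transform(hr))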

Relevance:

20.00%

Publisher:

Abstract:

Spatial representations, metaphors and imaginaries (cyberspace, web pages) have been the mainstay of internet research for a long time. Instead of repeating these themes, this paper seeks to answer the question of how we might understand the concept of time in relation to internet research. After a brief excursus on the general history of the concept, this paper proposes three different approaches to the conceptualisation of internet time. The common thread underlying all the approaches is the notion of time as an assemblage of elements such as technical artefacts, social relations and metaphors. By drawing out time in this way, the paper addresses the challenge of thinking of internet time as coexistence, a clash of fluxes, metaphors, lived experiences and assemblages. In other words, this paper proposes a way to articulate internet time as a multiplicity.

Relevance:

20.00%

Publisher:

Abstract:

The concept of radar was developed for the estimation of the distance (range) and velocity of a target from a receiver. The distance measurement is obtained by measuring the time taken for the transmitted signal to propagate to the target and return to the receiver. The target's velocity is determined by measuring the Doppler-induced frequency shift of the returned signal caused by the rate of change of the time-delay from the target. As researchers further developed conventional radar systems it became apparent that additional information was contained in the backscattered signal and that this information could in fact be used to describe the shape of the target itself. This is because a target can be considered to be a collection of individual point scatterers, each of which has its own velocity and time-delay. Delay-Doppler parameter estimation of each of these point scatterers therefore corresponds to a mapping of the target's range and cross-range, producing an image of the target.

Much research has been done in this area since the early radar imaging work of the 1960s. At present, radar imaging falls into two main categories. The first relates to the case where the backscattered signal is considered to be deterministic; the second to the case where the backscattered signal is of a stochastic nature. In both cases the information which describes the target's scattering function is extracted by the use of the ambiguity function, a function which correlates the backscattered signal in time and frequency with the transmitted signal.

In practical situations it is often necessary to have the transmitter and the receiver of the radar system sited at different locations. The problem in these situations is that a reference signal must then be present in order to calculate the ambiguity function. This causes an additional problem in that detailed phase information about the transmitted signal is then required at the receiver. It is this latter problem which has led to the investigation of radar imaging using time-frequency distributions. As will be shown in this thesis, the phase information about the transmitted signal can be extracted from the backscattered signal using time-frequency distributions.

The principal aim of this thesis was the development of, and subsequent discussion of, the theory of radar imaging using time-frequency distributions. Consideration is first given to the case where the target is diffuse, i.e. where the backscattered signal has temporal stationarity and a spatially white power spectral density. The complementary situation is also investigated, i.e. where the target is no longer diffuse but some degree of correlation exists between the time-frequency points. Computer simulations are presented to demonstrate the concepts and theories developed in the thesis.

For the proposed radar system to be practically realisable, both the time-frequency distributions and the associated algorithms developed must be able to be implemented in a timely manner. For this reason an optical architecture is proposed. This architecture is specifically designed to obtain the required time and frequency resolution when using laser radar imaging. The complex light amplitude distributions produced by this architecture have been computer simulated using an optical compiler.
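As a small numerical illustration of the ambiguity-function correlation described above (not the thesis's method or its optical implementation; the waveform, sampling rate and target parameters are arbitrary), the sketch below correlates a delayed, Doppler-shifted echo against Doppler-shifted copies of the transmitted chirp. The peak of the resulting delay/Doppler surface gives the time-delay tau and Doppler shift fd, from which range R = c*tau/2 and radial velocity v = fd*lambda/2 follow.

# Sketch: narrowband cross-ambiguity surface between a transmitted chirp and
# a delayed, Doppler-shifted echo.  All parameter values are illustrative only.
import numpy as np

fs = 1.0e6                            # sample rate (Hz), arbitrary
t = np.arange(0, 1e-3, 1 / fs)        # 1 ms observation window
tx = np.exp(1j * np.pi * 4e8 * t**2)  # linear FM (chirp) pulse

# Simulated point target: 200-sample delay and a 2 kHz Doppler shift
delay_samples, true_doppler = 200, 2.0e3
rx = np.roll(tx, delay_samples) * np.exp(2j * np.pi * true_doppler * t)

doppler_bins = np.linspace(-5e3, 5e3, 41)
surface = np.empty((doppler_bins.size, t.size))
for i, fd in enumerate(doppler_bins):
    # Doppler-shift the reference copy, then correlate in time to scan delay
    ref = tx * np.exp(2j * np.pi * fd * t)
    surface[i] = np.abs(np.correlate(rx, ref, mode="same"))

i_fd, i_tau = np.unravel_index(surface.argmax(), surface.shape)
print("estimated Doppler (Hz):", doppler_bins[i_fd])
# Delay offset from the window centre; range would follow from R = c * tau / 2
print("delay (samples from zero lag):", i_tau - t.size // 2)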