875 results for Use of time


Relevance:

100.00%

Publisher:

Abstract:

Skipjack (Katsuwonus pelamis), yellowfin (Thunnus albacares), and bigeye (Thunnus obesus) tunas are caught by purse-seine vessels in the eastern Pacific Ocean (EPO). Although there is no evidence to indicate that current levels of fishing-induced mortality will affect the sustainability of skipjack or yellowfin tunas, fishing mortality on juvenile (younger than 5 years of age) bigeye tuna has increased, and overall fishing mortality is greater than that necessary to produce the maximum sustainable yield of this species. We investigated whether time-area closures have the potential to reduce purse-seine bigeye catches without significantly reducing skipjack catches. Using catch and effort data for 1995–2002, we identified regions where the ratio of bigeye to skipjack tuna catches was high and applied simple closed-area models to investigate the possible benefits of time-area closures. We estimated that the most optimistic and operationally feasible 3-month closures, covering the equatorial region of the EPO during the third quarter of the year, could reduce bigeye catches by 11.5%, while reducing skipjack tuna catches by 4.3%. Because this level of bigeye tuna catch reduction is insufficient to address sustainability concerns, and larger and longer closures would reduce catches of this species significantly, we recommend that future research be directed toward gear technology solutions because these have been successful in many other fisheries. In particular, because over 50% of purse-seine catches of bigeye tuna are taken in sets in which bigeye tuna are the dominant species, methods to allow the determination of the species composition of aggregations around floating objects may be important.
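As a rough illustration of the closed-area bookkeeping described above (not the study's actual model or data), the sketch below aggregates catch by hypothetical area-quarter cells, ranks the cells by their bigeye-to-skipjack ratio, and computes the catch reductions implied by closing the highest-ratio cells, assuming no redistribution of effort.

```python
# Illustrative sketch only: the area/quarter grid, tonnages and column names are
# hypothetical and not taken from the original IATTC catch-and-effort dataset.
import pandas as pd

# Toy catch-and-effort table: tonnes caught per (area, quarter) cell.
catch = pd.DataFrame({
    "area":       ["EQ-1", "EQ-1", "EQ-2", "EQ-2", "N-1", "N-1"],
    "quarter":    [3, 4, 3, 4, 3, 4],
    "bigeye_t":   [900, 300, 700, 250, 100, 80],
    "skipjack_t": [4000, 5000, 3500, 4500, 6000, 5500],
})

# Rank cells by bigeye:skipjack ratio; high-ratio cells are closure candidates.
cells = (catch.groupby(["area", "quarter"])[["bigeye_t", "skipjack_t"]].sum()
              .assign(ratio=lambda d: d["bigeye_t"] / d["skipjack_t"])
              .sort_values("ratio", ascending=False))

def closure_impact(closed_cells):
    """Percent catch reduction if the given (area, quarter) cells are closed,
    assuming (naively) that effort is not redistributed elsewhere."""
    idx = catch.set_index(["area", "quarter"]).index.isin(closed_cells)
    closed = catch[idx]
    return {sp: 100 * closed[sp].sum() / catch[sp].sum()
            for sp in ("bigeye_t", "skipjack_t")}

print(closure_impact({("EQ-1", 3), ("EQ-2", 3)}))  # e.g. a third-quarter equatorial closure
```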

Relevance:

100.00%

Publisher:

Abstract:

The present research problem is to study the existing encryption methods and to develop a new technique that is superior in performance to the existing techniques and, at the same time, can be readily incorporated into the communication channels of fault-tolerant hard real-time systems alongside existing error-checking/error-correcting codes, so that attempts at eavesdropping can be defeated. Many encryption methods are available today, each with its own merits and demerits. Similarly, many cryptanalysis techniques used by adversaries are also available.

Relevance:

100.00%

Publisher:

Abstract:

Introduction: Comprehensive undergraduate education in the clinical sciences is grounded in the activities developed during clerkships. To implement the credit system, we must know how these experiences take place. Objectives: To describe how students spend their time in clerkships, how they assess the educational value of the activities, and how much enjoyment these activities provide. Method: We distributed a form to a random cluster sample of 100 clinical sciences students, designed to record the time spent on clerkship activities during one week and to assess their educational value and degree of enjoyment. Data were recorded and analyzed in Excel® 98 and SPSS. Results: Mean time spent by students on clerkship activities was 10.8 hours per day. Of those, 7.3 hours (69%) were spent in formal education activities. Patient care activities with teachers occupied the largest proportion of time (15.4%). Of the teaching and learning activities in a week, 28 hours (56%) were spent in patient care activities and 22.4 hours (44.5%) in independent academic work. The time spent in teaching and learning activities corresponds to 19 credits in an 18-week semester. The activities assessed as having the greatest educational value were homework activities (4.6) and formal education activities (4.5). Those graded as most enjoyable were extracurricular activities, formal educational activities and independent academic work. Conclusion: Our students spend more time in activities with patients than reported in the literature, and their attendance workload is greater than that reported in similar studies.

Relevance:

100.00%

Publisher:

Abstract:

Introduction Compounds exhibiting antioxidant activity have received much interest in the food industry because of their potential health benefits. Carotenoids such as lycopene, which in the human diet derives mainly from tomatoes (Solanum lycopersicum), have attracted much attention in this respect, and the study of their extraction, processing and storage procedures is of importance. Optical techniques potentially offer advantageous non-invasive and specific methods to monitor them. Objectives To obtain both fluorescence and Raman information to ascertain whether ultrasound-assisted extraction from tomato pulp has a detrimental effect on lycopene. Method Use of time-resolved fluorescence spectroscopy to monitor carotenoids in a hexane extract obtained from tomato pulp with application of ultrasound treatment (583 kHz). The resultant spectra were a combination of scattering and fluorescence. Because of their different timescales, decay-associated spectra could be used to separate the fluorescence and Raman information. This simultaneous acquisition of two complementary techniques was coupled with a very high time-resolution fluorescence lifetime measurement of the lycopene. Results Spectroscopic data showed the presence of phytofluene and chlorophyll in addition to lycopene in the tomato extract. The time-resolved spectral measurement containing both fluorescence and Raman data, coupled with high-resolution time-resolved measurements, where a lifetime of ~5 ps was attributed to lycopene, indicated that lycopene appeared unaltered by ultrasound treatment. Detrimental changes were, however, observed in both the chlorophyll and phytofluene contributions. Conclusion Extracted lycopene appeared unaffected by ultrasound treatment, while other constituents (chlorophyll and phytofluene) were degraded.
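To illustrate how components with very different timescales can be separated in such data, the sketch below fits a bi-exponential model to a synthetic decay containing a fast (~5 ps, lycopene-like) component and a slower fluorescence-like component; fitting such a model at each wavelength and collecting the amplitudes is what yields decay-associated spectra. The data and parameters are invented, not the study's measurements.

```python
# Minimal sketch, not the authors' analysis: recover a fast and a slow decay
# component by fitting a bi-exponential model; repeating the fit per wavelength
# and collecting the amplitudes a1(lambda), a2(lambda) gives decay-associated spectra.
import numpy as np
from scipy.optimize import curve_fit

def biexp(t, a1, tau1, a2, tau2):
    """Two-exponential decay: a1*exp(-t/tau1) + a2*exp(-t/tau2)."""
    return a1 * np.exp(-t / tau1) + a2 * np.exp(-t / tau2)

# Synthetic decay: a ~5 ps (lycopene-like) component plus a slower 200 ps component.
t = np.linspace(0.1, 1000.0, 2000)                    # time axis in ps
signal = biexp(t, 1.0, 5.0, 0.2, 200.0)
signal += np.random.default_rng(0).normal(0, 0.005, t.size)

popt, _ = curve_fit(biexp, t, signal, p0=(1.0, 10.0, 0.1, 100.0))
a1, tau1, a2, tau2 = popt
print(f"fast component: tau = {tau1:.1f} ps, slow component: tau = {tau2:.0f} ps")
```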

Relevance:

100.00%

Publisher:

Abstract:

The high morbidity and mortality associated with atherosclerotic coronary vascular disease (CVD) and its complications are being lessened by increased knowledge of risk factors, effective preventative measures and proven therapeutic interventions. However, significant CVD morbidity remains, and sudden cardiac death continues to be a presenting feature for some patients subsequently diagnosed with CVD. Coronary vascular disease is also the leading cause of anaesthesia-related complications. Stress electrocardiography/exercise testing is predictive of 10-year risk of CVD events, and the cardiovascular variables used to score this test are monitored peri-operatively. Similar physiological time-series datasets are being subjected to data mining methods for the prediction of medical diagnoses and outcomes. This study aims to find predictors of CVD using anaesthesia time-series data and patient risk factor data. Several pre-processing and predictive data mining methods are applied to these data. Physiological time-series data related to anaesthetic procedures are subjected to pre-processing methods for removal of outliers and calculation of moving averages, as well as to data summarisation and data abstraction methods. Feature selection methods of both wrapper and filter types are applied to derived physiological time-series variable sets alone and to the same variables combined with risk factor variables. The ability of these methods to identify subsets of highly correlated but non-redundant variables is assessed. The major dataset is derived from the entire anaesthesia population, and subsets of this population are considered to be at increased anaesthesia risk based on their need for more intensive monitoring (invasive haemodynamic monitoring and additional ECG leads). Because of the unbalanced class distribution in the data, majority-class under-sampling and the Kappa statistic, together with misclassification rate and area under the ROC curve (AUC), are used for evaluation of models generated using different prediction algorithms. The performance of models derived from feature-reduced datasets reveals the filter method, Cfs subset evaluation, to be the most consistently effective, although Consistency-derived subsets tended to slightly increase accuracy but markedly increase complexity. The use of misclassification rate (MR) for model performance evaluation is influenced by class distribution; this could be eliminated by considering the AUC or Kappa statistic, as well as by evaluating subsets with an under-sampled majority class. The noise and outlier removal pre-processing methods produced models with MR ranging from 10.69 to 12.62, the lowest value being for data from which both outliers and noise were removed (MR 10.69). For the raw time-series dataset, MR is 12.34. Feature selection reduces MR to between 9.8 and 10.16, with the time-segmented summary data (dataset F) at 9.8 and the raw time-series summary data (dataset A) at 9.92. However, for all datasets based on time-series data only, model complexity is high. For most pre-processing methods, Cfs could identify a subset of correlated and non-redundant variables from the time-series-only datasets, but the models derived from these subsets consist of a single leaf only. MR values are consistent with the class distribution in the subset folds evaluated in the n-fold cross-validation method.

For models based on Cfs-selected time-series-derived and risk factor (RF) variables, MR ranges from 8.83 to 10.36, with dataset RF_A (raw time-series data and RF) at 8.85 and dataset RF_F (time-segmented time-series variables and RF) at 9.09. The models based on counts of outliers and counts of data points outside the normal range (dataset RF_E), and on variables derived from time series transformed using Symbolic Aggregate Approximation (SAX) with associated time-series pattern cluster membership (dataset RF_G), perform the least well, with MR of 10.25 and 10.36 respectively. For coronary vascular disease prediction, nearest neighbour (NNge) and the support vector machine based method, SMO, have the highest MR, at 10.1 and 10.28, while logistic regression (LR) and the decision tree (DT) method, J48, have MR of 8.85 and 9.0 respectively. DT rules are the most comprehensible and clinically relevant. The increase in predictive accuracy achieved by adding risk factor variables to time-series-based models is significant. The addition of time-series-derived variables to models based on risk factor variables alone is associated with a trend towards improved performance. Data mining of feature-reduced anaesthesia time-series variables together with risk factor variables can produce compact and moderately accurate models able to predict coronary vascular disease. Decision tree analysis of time-series data combined with risk factor variables yields rules that are more accurate than models based on time-series data alone. The limited additional value provided by electrocardiographic variables compared with the use of risk factors alone is consistent with recent suggestions that exercise electrocardiography (exECG) under standardised conditions has limited additional diagnostic value over risk factor analysis and symptom pattern. The pre-processing used in this study had limited effect when time-series variables and risk factor variables are used as model input. In the absence of risk factor input, the use of time-series variables after outlier removal, and of time-series variables based on physiological variable values being outside the accepted normal range, is associated with some improvement in model performance.
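The evaluation strategy described above (majority-class under-sampling, then misclassification rate, Kappa and AUC under cross-validation) can be sketched as follows. This is an illustrative scikit-learn version with placeholder data; the study itself used Weka learners (J48, SMO, NNge, logistic regression) and Cfs/Consistency feature selection.

```python
# Illustrative sketch of the evaluation strategy only; X and y are synthetic
# placeholders, not the anaesthesia time-series or risk factor variables.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import cohen_kappa_score, roc_auc_score
from sklearn.model_selection import StratifiedKFold

X, y = make_classification(n_samples=2000, weights=[0.9, 0.1], random_state=0)

def undersample(X, y, rng):
    """Randomly drop majority-class rows until both classes are equal in size."""
    maj, mino = np.flatnonzero(y == 0), np.flatnonzero(y == 1)
    keep = np.concatenate([rng.choice(maj, size=mino.size, replace=False), mino])
    return X[keep], y[keep]

rng = np.random.default_rng(0)
kappas, aucs, mrs = [], [], []
for tr, te in StratifiedKFold(n_splits=10, shuffle=True, random_state=0).split(X, y):
    Xb, yb = undersample(X[tr], y[tr], rng)          # balance the training fold only
    clf = LogisticRegression(max_iter=1000).fit(Xb, yb)
    pred, prob = clf.predict(X[te]), clf.predict_proba(X[te])[:, 1]
    mrs.append(100 * np.mean(pred != y[te]))         # misclassification rate (%)
    kappas.append(cohen_kappa_score(y[te], pred))
    aucs.append(roc_auc_score(y[te], prob))

print(f"MR {np.mean(mrs):.1f}%  Kappa {np.mean(kappas):.2f}  AUC {np.mean(aucs):.2f}")
```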

Relevance:

100.00%

Publisher:

Abstract:

In total, 782 Escherichia coli strains originating from various host sources have been analyzed in this study by using a highly discriminatory single-nucleotide polymorphism (SNP) approach. A set of eight SNPs, with a discrimination value (Simpson's index of diversity [D]) of 0.96, was determined using the Minimum SNPs software, based on sequences of housekeeping genes from the E. coli multilocus sequence typing (MLST) database. Allele-specific real-time PCR was used to screen 114 E. coli isolates from various fecal sources in Southeast Queensland (SEQ). The combined analysis of both the MLST database and SEQ E. coli isolates using eight high-D SNPs resolved the isolates into 74 SNP profiles. The data obtained suggest that SNP typing is a promising approach for the discrimination of host-specific groups and allows for the identification of human-specific E. coli in environmental samples. However, a more diverse E. coli collection is required to determine animal- and environment-specific E. coli SNP profiles due to the abundance of human E. coli strains (56%) in the MLST database.
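The discrimination value quoted is Simpson's index of diversity, which can be computed from the number of isolates falling into each typing group (Hunter-Gaston formulation). A minimal sketch with invented group sizes:

```python
# Minimal sketch of the discriminatory power calculation: Simpson's index of
# diversity computed from the number of isolates assigned to each SNP profile.
# The group sizes below are invented for illustration only.
from collections import Counter

def simpsons_diversity(type_counts):
    """D = 1 - sum(n_j*(n_j - 1)) / (N*(N - 1)) over all typing groups."""
    n = sum(type_counts)
    return 1 - sum(c * (c - 1) for c in type_counts) / (n * (n - 1))

# e.g. 114 isolates spread over hypothetical SNP-profile group sizes
profiles = Counter({"P01": 20, "P02": 15, "P03": 10, "P04": 9, "P05": 60})
print(f"D = {simpsons_diversity(profiles.values()):.3f}")
```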

Relevance:

100.00%

Publisher:

Abstract:

Transmission smart grids will use a digital platform for the automation of high voltage substations. The IEC 61850 series of standards, released in parts over the last ten years, provides a specification for substation communications networks and systems. These standards, along with IEEE Std 1588-2008 Precision Time Protocol version 2 (PTPv2) for precision timing, are recommended by both the IEC Smart Grid Strategy Group and the NIST Framework and Roadmap for Smart Grid Interoperability Standards for substation automation. IEC 61850, PTPv2 and Ethernet are three complementary protocol families that together define the future of sampled value digital process connections for smart substation automation. A time synchronisation system is required for a sampled value process bus; however, the details are not defined in IEC 61850-9-2. PTPv2 provides the greatest accuracy of network-based time transfer systems, with timing errors of less than 100 ns achievable. The suitability of PTPv2 for synchronising sampling in a digital process bus is evaluated, with preliminary results indicating that the steady-state performance of low-cost clocks is an acceptable ±300 ns, but that corrections issued by grandmaster clocks can introduce significant transients. Extremely stable grandmaster oscillators are required to ensure that any corrections are small enough that time synchronisation performance is not degraded.
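The accuracy figures above ultimately rest on the basic PTP offset and path-delay arithmetic from the Sync/Delay_Req timestamp exchange. A minimal sketch with illustrative timestamps (real PTPv2 exchanges add correction fields and servo filtering):

```python
# Sketch of the basic IEEE 1588 (PTP) offset/delay arithmetic that underlies the
# synchronisation accuracy discussed above. Timestamps are illustrative values
# in nanoseconds, not measurements from the evaluation described in the abstract.

def ptp_offset_and_delay(t1, t2, t3, t4):
    """t1: Sync sent (master clock), t2: Sync received (slave clock),
    t3: Delay_Req sent (slave clock), t4: Delay_Req received (master clock).
    Assumes a symmetric network path."""
    offset = ((t2 - t1) - (t4 - t3)) / 2   # slave clock error relative to master
    delay = ((t2 - t1) + (t4 - t3)) / 2    # one-way mean path delay
    return offset, delay

# Example: slave clock running 250 ns ahead over a 1200 ns path.
offset, delay = ptp_offset_and_delay(t1=0, t2=1450, t3=5000, t4=5950)
print(f"offset = {offset:.0f} ns, mean path delay = {delay:.0f} ns")
```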

Relevance:

100.00%

Publisher:

Abstract:

This paper provides a fundamental understanding of the use of cumulative plots for travel time estimation on signalized urban networks. Analytical modeling is performed to generate cumulative plots based on the availability of data: a) Case-D, for detector data only; b) Case-DS, for detector data and signal timings; and c) Case-DSS, for detector data, signal timings and saturation flow rate. An empirical study and a sensitivity analysis based on simulation experiments show consistent performance for Case-DS and Case-DSS, whereas the performance of Case-D is inconsistent. Case-D is sensitive to the detection interval and to the signal timings within that interval. When the detection interval is an integral multiple of the signal cycle, both accuracy and reliability are low, whereas for a detection interval of around 1.5 times the signal cycle both accuracy and reliability are high.
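The core idea behind cumulative plots can be sketched independently of the paper's Case-D/DS/DSS models: under first-in-first-out, the horizontal gap between the upstream and downstream cumulative count curves at a given vehicle number is that vehicle's travel time. The counts below are invented.

```python
# Illustrative sketch of the cumulative-plot principle, not the paper's models.
import numpy as np

# Cumulative counts sampled each second at the upstream detector and the stop line.
t = np.arange(0.0, 121.0)                         # time (s)
cum_up = 0.4 * t                                  # vehicles arriving at upstream detector
cum_down = np.where(t < 25, 0.0, 0.4 * (t - 25))  # same demand crossing the stop line 25 s later

def travel_time(n):
    """Horizontal gap between the two cumulative curves at vehicle count n (FIFO)."""
    t_in = np.interp(n, cum_up, t)    # when vehicle n crossed the upstream detector
    t_out = np.interp(n, cum_down, t) # when vehicle n crossed the stop line
    return t_out - t_in

print(f"travel time of the 10th vehicle: {travel_time(10):.1f} s")
```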

Relevance:

100.00%

Publisher:

Abstract:

This report is the eighth deliverable of the Real Time and Predictive Traveller Information project and the third deliverable of the Arterial Travel Time Information sub-project in the Integrated Traveller Information research domain of the Smart Transport Research Centre. The primary objective of the Arterial Travel Time Information sub-project is to develop algorithms for real-time travel time estimation and prediction models for arterial traffic. The Brisbane arterial network is densely equipped with Bluetooth MAC Scanners, which can provide travel time information. The literature offers limited knowledge of the Bluetooth-protocol-based data acquisition process and of the accuracy and reliability of analyses performed using these data. This report expands the body of knowledge surrounding the use of data from Bluetooth MAC Scanners (BMS) as a complementary traffic data source. A multi-layer simulation model named Traffic and Communication Simulation (TCS) is developed. TCS is used to model the theoretical properties of the BMS data and to analyse the accuracy and reliability of travel time estimation using the BMS data.
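The basic BMS travel-time computation can be sketched as follows: match the same (anonymised) MAC address at an upstream and a downstream scanner and difference the detection timestamps. The records below are invented, and a real pipeline must also handle repeated detections per pass and outlier filtering.

```python
# Minimal sketch of how BMS observations yield travel times; MAC hashes and
# timestamps are invented for illustration only.
from datetime import datetime

upstream = {"a1f3": datetime(2013, 5, 1, 8, 0, 5), "7c2e": datetime(2013, 5, 1, 8, 1, 10)}
downstream = {"a1f3": datetime(2013, 5, 1, 8, 3, 35), "7c2e": datetime(2013, 5, 1, 8, 4, 50)}

travel_times = {
    mac: (downstream[mac] - upstream[mac]).total_seconds()
    for mac in upstream.keys() & downstream.keys()     # only MACs seen at both sites
}
print(travel_times)   # e.g. {'a1f3': 210.0, '7c2e': 220.0}
```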

Relevance:

100.00%

Publisher:

Abstract:

The primary objective of this study is to develop a robust queue estimation algorithm for motorway on-ramps. Real-time queue information is a vital input for dynamic queue management on metered on-ramps. Accurate and reliable queue information enables the on-ramp queue to be managed adaptively to the actual queue size, and thus minimises the adverse impacts of queue flush while increasing the benefit of ramp metering. The proposed algorithm is developed within the Kalman filter framework. The fundamental conservation model is used to predict the system state (queue size) from the flow-in and flow-out measurements. This projection is then updated via the measurement equation using the time occupancies from mid-link and link-entrance loop detectors. The study also proposes a novel single-point correction method, which resets the estimated system state to eliminate the counting errors that accumulate over time. In the performance evaluation, the proposed algorithm demonstrated accurate and reliable performance and consistently outperformed the benchmark Single Occupancy Kalman filter (SOKF) method. The improvements over SOKF are 62% and 63% on average in terms of estimation accuracy (MAE) and reliability (RMSE), respectively. The benefit of the algorithm's innovative concepts is well justified by the improved estimation performance in congested ramp traffic conditions, where long queues may significantly compromise the benchmark algorithm's performance.
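A minimal sketch of a conservation-based Kalman filter of the kind described above is given below; the noise values, the occupancy-to-queue conversion and the reset rule are placeholders rather than the paper's calibrated parameters.

```python
# Illustrative sketch of a conservation-model Kalman filter for on-ramp queue
# estimation, in the spirit of the algorithm described above.
class RampQueueKF:
    def __init__(self, q0=0.0, p0=10.0, process_var=4.0, meas_var=25.0):
        self.q, self.p = q0, p0                     # queue estimate (veh) and its variance
        self.process_var, self.meas_var = process_var, meas_var

    def predict(self, flow_in, flow_out):
        """Conservation model: the queue grows by (arrivals - departures) per interval."""
        self.q = max(self.q + flow_in - flow_out, 0.0)
        self.p += self.process_var

    def update(self, queue_meas):
        """Correct with a queue size inferred from mid-link / entrance occupancies."""
        k = self.p / (self.p + self.meas_var)       # Kalman gain
        self.q = max(self.q + k * (queue_meas - self.q), 0.0)
        self.p *= (1.0 - k)

    def reset(self, known_queue):
        """Single-point correction: discard accumulated counting error when the
        queue size is known with confidence (placeholder rule)."""
        self.q, self.p = known_queue, 1.0

kf = RampQueueKF()
for flow_in, flow_out, occ_queue in [(12, 10, 3), (15, 10, 8), (14, 12, 10)]:
    kf.predict(flow_in, flow_out)
    kf.update(occ_queue)
print(f"estimated queue: {kf.q:.1f} vehicles")
```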

Relevance:

100.00%

Publisher:

Abstract:

Many countries conduct regular national time use surveys, some of which date back as far as the 1960s. Time use surveys potentially provide more detailed and accurate national estimates of the prevalence of sedentary and physical activity behavior than more traditional self-report surveillance systems. In this study, the authors determined the reliability and validity of time use surveys for assessing sedentary and physical activity behavior. In 2006 and 2007, participants (n = 134) were recruited from work sites in the Australian state of New South Wales. Participants completed a 2-day time use diary twice, 7 days apart, and wore an accelerometer. The 2 diaries were compared for test-retest reliability, and comparison with the accelerometer determined concurrent validity. Participants with similar activity patterns during the 2 diary periods showed reliability intraclass correlations of 0.74 and 0.73 for nonoccupational sedentary behavior and moderate/vigorous physical activity, respectively. Comparison of the diary with the accelerometer showed Spearman correlations of 0.57-0.59 and 0.45-0.69 for nonoccupational sedentary behavior and moderate/vigorous physical activity, respectively. Time use surveys appear to be more valid for population surveillance of nonoccupational sedentary behavior and health-enhancing physical activity than more traditional surveillance systems. National time use surveys could be used to retrospectively study nonoccupational sedentary and physical activity behavior over the past 5 decades.
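The two statistics reported above can be sketched as follows, using an intraclass correlation for the diary test-retest comparison (ICC(2,1) is assumed here as one common choice; the abstract does not specify the form) and a Spearman correlation for diary versus accelerometer, with invented minutes.

```python
# Sketch only: ICC(2,1) (Shrout-Fleiss two-way random, single measure) and a
# Spearman correlation on synthetic daily sedentary minutes, not the study's data.
import numpy as np
from scipy.stats import spearmanr

def icc_2_1(x):
    """ICC(2,1) for an (n subjects x k occasions) matrix."""
    n, k = x.shape
    grand = x.mean()
    ms_r = k * ((x.mean(axis=1) - grand) ** 2).sum() / (n - 1)   # between-subjects mean square
    ms_c = n * ((x.mean(axis=0) - grand) ** 2).sum() / (k - 1)   # between-occasions mean square
    ss_e = ((x - grand) ** 2).sum() - ms_r * (n - 1) - ms_c * (k - 1)
    ms_e = ss_e / ((n - 1) * (k - 1))                            # residual mean square
    return (ms_r - ms_e) / (ms_r + (k - 1) * ms_e + k * (ms_c - ms_e) / n)

rng = np.random.default_rng(1)
truth = rng.normal(400, 90, 60)                       # "true" daily sedentary minutes
diary = np.column_stack([truth + rng.normal(0, 40, 60) for _ in range(2)])
accel = truth + rng.normal(0, 60, 60)

rho, p = spearmanr(diary[:, 0], accel)
print(f"test-retest ICC(2,1): {icc_2_1(diary):.2f}")
print(f"diary vs accelerometer Spearman rho: {rho:.2f}")
```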

Relevance:

100.00%

Publisher:

Abstract:

The correlation dimension D2 and the correlation entropy K2 are both important quantifiers in nonlinear time series analysis. However, the use of D2 has been more common than that of K2 as a discriminating measure. One reason for this is that D2 is a static measure and can be easily evaluated from a time series. However, in many cases, especially those involving coloured noise, K2 is regarded as the more useful measure. Here we present an efficient algorithmic scheme to compute K2 directly from time series data and show that K2 can be used as a more effective measure than D2 for analysing practical time series involving coloured noise.
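One standard (Grassberger-Procaccia style) route to K2, not necessarily the paper's own scheme, is to compute correlation sums for embedding dimensions m and m+1 and take the logarithm of their ratio; a minimal sketch with placeholder embedding parameters follows.

```python
# Sketch of a standard K2 estimate: K2 ~ (1/tau) * ln(C_m(r) / C_{m+1}(r)).
# The embedding dimension m, delay tau and scale r below are placeholders.
import numpy as np

def delay_embed(x, m, tau):
    """Delay-coordinate embedding: rows are [x_i, x_{i+tau}, ..., x_{i+(m-1)tau}]."""
    n = len(x) - (m - 1) * tau
    return np.column_stack([x[i * tau: i * tau + n] for i in range(m)])

def correlation_sum(emb, r):
    """Fraction of distinct point pairs closer than r (max-norm)."""
    d = np.max(np.abs(emb[:, None, :] - emb[None, :, :]), axis=-1)
    n = len(emb)
    return (np.sum(d < r) - n) / (n * (n - 1))

def k2_estimate(x, m, tau, r):
    cm = correlation_sum(delay_embed(x, m, tau), r)
    cm1 = correlation_sum(delay_embed(x, m + 1, tau), r)
    return np.log(cm / cm1) / tau

# Example on a short chaotic (logistic-map) series; m, tau and r chosen for illustration.
x = np.empty(1500); x[0] = 0.4
for i in range(1, len(x)):
    x[i] = 4.0 * x[i - 1] * (1.0 - x[i - 1])
print(f"K2 estimate: {k2_estimate(x, m=3, tau=1, r=0.05):.2f}")
```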

Relevance:

100.00%

Publisher:

Abstract:

A real-time reverse transcription polymerase chain reaction (qRT-PCR) test for the matrix gene of type A influenza viruses was used during the 2007 Australian equine influenza (EI) outbreak to confirm diagnosis and, later, eradication of the virus. During the outbreak, horses being exported required vaccination and individual proof of freedom from EI. At the end of the outbreak, positive results were obtained from four horses destined for export because their samples had been contaminated with the vaccine. This report highlights the need for EI testing and vaccination to occur on separate days, with the collection of swabs for testing preceding vaccination.