933 results for Driver error


Relevance:

20.00%

Publisher:

Abstract:

Non-Technical Summary, Seafood CRC Project 2009/774: Harvest strategy evaluations and co-management for the Moreton Bay Trawl Fishery. Principal Investigator: Dr Tony Courtney, Principal Fisheries Biologist, Fisheries and Aquaculture, Agri-Science Queensland, Department of Agriculture, Fisheries and Forestry, Level B1, Ecosciences Precinct, Joe Baker St, Dutton Park, Queensland 4102. Email: tony.courtney@daff.qld.gov.au

Project objectives:

1. Review the literature and data (i.e., economic, biological and logbook) relevant to the Moreton Bay trawl fishery.
2. Identify and prioritise management objectives for the Moreton Bay trawl fishery, as identified by the trawl fishers.
3. Undertake an economic analysis of the Moreton Bay trawl fishery.
4. Quantify long-term changes to fishing power for the Moreton Bay trawl fishery.
5. Assess priority harvest strategies identified in 2 (above). Present results to, and discuss results with, the Moreton Bay Seafood Industry Association (MBSIA), fishers and Fisheries Queensland.

Note: Additional, specific objectives for 2 (above) were developed by fishers and the MBSIA after commencement of the project. These are presented in detail in section 5 (below). The project was an initiative of the MBSIA, primarily in response to falling profitability in the Moreton Bay prawn trawl fishery. The analyses were undertaken by a consortium of DAFF, CSIRO and University of Queensland researchers. This report adopted the Australian Standard Fish Names (http://www.fishnames.com.au/).

Trends in catch and effort

The Moreton Bay otter trawl fishery is a multispecies fishery, with the majority of the catch composed of Greasyback Prawns (Metapenaeus bennettae), Brown Tiger Prawns (Penaeus esculentus), Eastern King Prawns (Melicertus plebejus), squid (Uroteuthis spp., Sepioteuthis spp.), Banana Prawns (Fenneropenaeus merguiensis), Endeavour Prawns (Metapenaeus ensis, Metapenaeus endeavouri) and Moreton Bay bugs (Thenus parindicus). Other commercially important byproduct includes blue swimmer crabs (Portunus armatus), three-spot crabs (Portunus sanguinolentus), cuttlefish (Sepia spp.) and mantis shrimp (Oratosquilla spp.). Logbook catch and effort data show that the total annual reported catch of prawns from the Moreton Bay otter trawl fishery has declined from a maximum of 901 t in 1990 to 315 t in 2008. The number of active licensed vessels participating in the fishery has also declined, from 207 in 1991 to 57 in 2010. Similarly, fishing effort has fallen from a peak of 13,312 boat-days in 1999 to 3817 boat-days in 2008 – a 71% reduction. The declines in catch and effort are largely attributed to reduced profitability in the fishery due to increased operational costs and depressed prawn prices. The low prawn prices appear to be attributable to Australian aquacultured prawns and imported aquacultured vannamei prawns displacing the markets for trawl-caught prawns, especially small species such as Greasyback Prawns, which traditionally dominated landings in Moreton Bay. In recent years, the relatively high Australian dollar has resulted in reduced exports of Australian wild-caught prawns. This has increased supply on the domestic market, which has also suppressed price increases. Since 2002, Brown Tiger Prawns have dominated annual reported landings in the Moreton Bay fishery. While total catch and effort in the bay have declined to historically low levels, the annual catch and catch rates of Brown Tiger Prawns have been at record highs in recent years.
This appears to be at least partially attributable to the tiger prawn stock having recovered from excessive effort in previous decades. The total annual value of the Moreton Bay trawl fishery catch, including byproduct, is about $5 million, of which Brown Tiger Prawns account for about $2 million. Eastern King Prawns make up about 10% of the catch and are mainly caught in the bay from October to December as they migrate to offshore waters outside the bay, where they contribute to a large mono-specific trawl fishery. Some of the Eastern King Prawns harvested in Moreton Bay may be growth overfished (i.e., caught below the size required to maximise yield or value), although the optimum size-at-capture was not determined in this study. Banana Prawns typically make up about 5% of the catch, but can exceed 20%, particularly following heavy rainfall.

Economic analysis of the fishery

From the economic survey, cash profits were, on average, positive for both fleet segments in both years of the survey. However, after the opportunity cost of capital and depreciation were taken into account, the residual owner-operator income was relatively low, and substantially lower than the average share of revenue paid to employed skippers. Consequently, owner-operators were earning less than the opportunity cost of their labour, suggesting that the fleets were economically unviable in the longer term. The M2 licensed fleet were, on average, earning boat cash profits similar to the T1/M1 fleet, although after the higher capital costs were accounted for, the T1/M1 boats were earning substantially lower returns to owner-operator labour. The mean technical efficiency for the fleet as a whole was estimated to be 0.67. That is, on average, the boats were only catching 67 per cent of what was possible given their level of inputs (hours fished and hull units). Almost one-quarter of observations had efficiency scores above 0.8, suggesting that a substantial proportion of the fleet is relatively efficient, but some vessels are also relatively inefficient. Both fleets had similar efficiency distributions, with median technical efficiency scores of 0.71 and 0.67 for the M2 and T1/M1 boats respectively. These scores are reasonably consistent with other studies of prawn trawl fleets in Australia, although higher average efficiency scores were found in the NSW prawn trawl fleet. From the inefficiency model, several factors were found to significantly influence vessel efficiency. These included the number of years of experience as skipper, the number of generations that the skipper's family had been fishing and the number of years of schooling. Skippers with more schooling were significantly more efficient than skippers with lower levels of schooling, consistent with other studies. Skippers who had been fishing longer were, in fact, less efficient than newer skippers. However, this was mitigated in the case of skippers whose family had been involved in fishing for several generations, consistent with other studies and suggesting that skill was passed down through families over successive generations. Both the linear and log-linear regression models of total fishing effort against the marginal profit per hour performed reasonably well, explaining between 70 and 84 per cent of the variation in fishing effort. As the models had different dependent variables (one logged and the other not), the explained variation is not a good basis for model choice.
A better comparator is the square root of the mean square error (SMSE) expressed as a percentage of the mean total effort. On this criterion, both models performed very similarly. The linear model suggests that each additional dollar of average profits per hour in the fishery increases total effort by around 26 hours each month. From the log-linear model, each percentage increase in profits per hour increases total fishing effort by 0.13 per cent. Both models indicate that economic performance is a key driver of fishing effort in the fishery. The effect of removing the boat-replacement policy is to increase individual vessel profitability, catch and effort, but the overall increase in catch is less than that removed by the boats that must exit the fishery. That is, the smaller fleet (in terms of boat numbers) is more profitable, but the overall catch is not expected to be greater than before. This assumes, however, that active boats are removed, and that these were also taking an average level of catch. If inactive boats are removed, then the catch of the remaining group as a whole could increase by between 14 and 17 per cent, depending on the degree to which costs are reduced with the new boats. This is still substantially lower than historical levels of catch by the fleet.

Fishing power analyses

An analysis of logbook data from 1988 to 2010, and survey information on fishing gear, was performed to estimate the long-term variation in the fleet's ability to catch prawns (known as fishing power) and to derive abundance estimates of the three most commercially important prawn species (i.e., Brown Tiger, Eastern King and Greasyback Prawns). Generalised linear models were used to explain the variation in catch as a function of effort (i.e., hours fished per day), vessel and gear characteristics, onboard technologies, population abundance and environmental factors. This analysis estimated that fishing power associated with Brown Tiger and Eastern King Prawns increased by 10–30% over the past 20 years, while fishing power for Greasyback Prawns declined by approximately 10%. The density of tiger prawns was estimated to have almost tripled, from around 0.5 kg per hectare in 1988 to 1.5 kg/ha in 2010. The density of Eastern King Prawns was estimated to have fluctuated between 1 and 2 kg per hectare over this time period, without any noticeable overall trend, while Greasyback Prawn densities were estimated to have fluctuated between 2 and 6 kg per hectare, also without any distinctive trend. A model of tiger prawn catches was developed to evaluate the impact of fishing on prawn survival rates in Moreton Bay. The model was fitted to logbook data using the maximum-likelihood method to provide estimates of the natural mortality rate (0.038 and 0.062 per week) and catchability, which can be defined as the proportion of the fished population that is removed by one unit of effort and was estimated here to be (2.5 ± 0.4) × 10⁻⁴ per boat-day. This approach provided a method for industry and scientists to jointly develop a realistic model of the dynamics of the fishery. Several aspects need to be developed further to make this model acceptable to industry. Firstly, there is considerable evidence to suggest that temperature influences prawn catchability. This ecological effect should be incorporated before developing meaningful harvest strategies. Secondly, total effort has to be allocated among the species. Such allocation of effort could be included in the model by estimating several catchability coefficients.
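To make the quoted quantities concrete, the sketch below implements a generic weekly depletion model of the Baranov type using the natural mortality and catchability values quoted above; the initial abundance and the weekly effort series are hypothetical placeholders, and this is only an illustrative stand-in, not the report's fitted model.

```python
import numpy as np

# Illustrative weekly depletion/catch sketch (a generic Baranov-type formulation,
# not the report's fitted model).  M and q are taken from the estimates quoted
# above; initial abundance and weekly effort are hypothetical placeholders.
M = 0.05                       # natural mortality per week (report: 0.038-0.062)
q = 2.5e-4                     # catchability per boat-day (report: (2.5 +/- 0.4) x 10^-4)
effort = np.full(20, 100.0)    # hypothetical boat-days fished per week, 20-week season
N = 1.0e6                      # hypothetical number of recruited prawns at week 0

catches = []
for E in effort:
    F = q * E                               # fishing mortality for the week
    Z = F + M                               # total mortality
    catches.append(N * (F / Z) * (1 - np.exp(-Z)))  # Baranov catch equation
    N *= np.exp(-Z)                         # survivors carried into the next week

print(f"Season catch (numbers of prawns): {sum(catches):.0f}")
```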
Nevertheless, the work presented in this report is a stepping stone towards estimating essential fishery parameters and developing representative mathematical models required to evaluate harvest strategies. Developing a method that allowed an effective discussion between industry, management and scientists took longer than anticipated. As a result, harvest strategy evaluations were preliminary and only included the most valuable species in the fishery, Brown Tiger Prawns. Additional analyses and data collection, including information on catch composition from field sampling, migration rates and recruitment, would improve the modelling.

Harvest strategy evaluations

As the harvest strategy evaluations are preliminary, the following results should not be adopted for management purposes until more thorough evaluations are performed. The effects of closing the fishery for one calendar month on the annual catch and value of Brown Tiger Prawns were investigated. Each of the 12 months (i.e., January to December) was evaluated. The results were compared against historical records to determine the magnitude of gain or loss associated with the closure. Uncertainty regarding the trawl selectivity was addressed using two selectivity curves, one with a weight at 50% selection (S50%) of 7 g, based on research data, and a second with S50% of 14 g, put forward by industry. In both cases, it was concluded that any monthly closure after February would not be beneficial to the industry. The magnitude of the benefit of closing the fishery in either January or February was sensitive to which mesh selectivity curve was assumed, with greater benefit achieved when the smaller selectivity curve (i.e., S50% = 7 g) was assumed. Using the smaller selectivity (S50% = 7 g), the expected increase in catch value was 10–20%, which equates to $200,000 to $400,000 annually, while the larger selectivity curve (S50% = 14 g) suggested catch value would be improved by 5–10%, or $100,000 to $200,000. The harvest strategy evaluations showed that greater benefits, in the order of 30–60% increases in the annual tiger prawn catch value, could have been obtained by closing the fishery early in the year when annual effort levels were high (i.e., > 10,000 boat-days). In recent years, as effort levels have declined (i.e., ~4000 boat-days annually), expected benefits from such closures are more modest. In essence, temporal closures offer greater benefit when fishing mortality rates are high. A spatial analysis of Brown Tiger Prawn catch and effort was also undertaken to obtain a better understanding of the prawn population dynamics. This indicated that, to improve profitability of the fishery, fishers could consider closing the fishery in the period from June to October, which is already a period of low profitability. This would protect the Brown Tiger Prawn spawning stock, increase catch rates of all species in the lucrative pre-Christmas period (November–December), and provide fishers with time to do vessel maintenance, arrange markets for the next season's harvest, and, if they wish, work at other jobs. The analysis found that the instantaneous rate of total mortality (Z) for the March–June period did not vary significantly over the last two decades.
As the Brown Tiger Prawn population in Moreton Bay has clearly increased over this time period while fishing effort, and hence fishing mortality, has declined, an interesting conclusion is that the instantaneous rate of natural mortality (M) must have increased, suggesting that tiger prawn natural mortality may be density-dependent at this time of year. Mortality rates of tiger prawns for June–October were found to have decreased over the last two decades, which has probably had a positive effect on spawning stocks in the October–November spawning period.

Abiotic effects on the prawns

The influence of air temperature, rainfall, freshwater flow, the southern oscillation index (SOI) and lunar phase on the catch rates of the four main prawn species was investigated. The analyses were based on over 200,000 daily logbook catch records over 23 years (i.e., 1988–2010). Freshwater flow was more influential than rainfall and SOI, and of the various sources of flow, the Brisbane River has the greatest volume and influence on Moreton Bay prawn catches. A number of time-lags were also considered. Flow in the month preceding the catch (i.e., 30 days prior, Logflow1_30) and in the two months prior (31–60 days prior, Logflow31_60) had strong positive effects on Banana Prawn catch rates. Average air temperature in the preceding 4–6 months (Temp121_180) also had a large positive effect on Banana Prawn catch rates. Flow in the month immediately preceding the catch (Logflow1_30) had a strong positive influence on Greasyback Prawn catch rates. Air temperature in the two months preceding the catch (Temp1_60) had a large positive effect on Brown Tiger Prawn catch rates. No obvious or marked effects were detected for Eastern King Prawns, although, interestingly, catch rates declined with increasing air temperature 4–6 months prior to catch. As most Eastern King Prawn catches in Moreton Bay occur in October to December, the results suggest catch rates decline with increasing winter temperatures. In most cases, the prawn catch rates declined with the waxing lunar phase (high luminance/full moon), and increased with the waning moon (low luminance/new moon). The SOI explains little additional variation in prawn catch rates (<2%), although its influence was higher for Banana Prawns. Extrapolation of these findings to long-term climate change effects should be done with caution. That said, the results are consistent with likely increases in abundance in the region for the two tropical species, Banana Prawns and Brown Tiger Prawns, as coastal temperatures rise. Conversely, declines in abundance could be expected for the two temperate species, Greasyback and Eastern King Prawns.

Corporate management structures

An examination of alternative governance systems was requested by the industry at one of the early meetings, particularly systems that may give fishers greater autonomy in decision making as well as help improve the marketing of their product. Consequently, a review of alternative management systems was undertaken, with a particular focus on the potential for self-management of small fisheries (small in terms of number of participants) and corporate management. The review examined systems that have been implemented or proposed for other small fisheries internationally, with a particular focus on self-management as well as the potential benefits and challenges of corporate management. This review also highlighted particular opportunities for the Moreton Bay prawn fishery.
Corporate management differs from other co-management and even self-management arrangements in that 'ownership' of the fishery is devolved to a company in which fishers and government are shareholders. The company manages the fishery and coordinates marketing to ensure that the best prices are received and that the catch taken meets the demands of the market. Coordinated harvesting will also result in increased profits, which are returned to fishers in the form of dividends. Corporate management offers many of the potential benefits of an individual quota system without formally implementing such a system. A corporate management model offers an advantage over a self-management model in that it can coordinate both marketing and management to exploit the fishery's unique geographical position. For such a system to be successful, the fishery needs to be relatively small and self-contained. Small in this sense is in terms of number of operators. The Moreton Bay prawn fishery satisfies these key conditions for a successful self-management and potentially corporate management system. The fishery is small both in terms of number of participants and geography. Unlike other fisheries that have progressed down the self-management route, the key market for the product from the Moreton Bay fishery is right at its doorstep. Corporate management also presents a number of challenges. First, it will require changes in the way fishers operate. In particular, the decision on when to fish and what to catch will be taken away from the individual and decided by the collective. Problems will develop if individuals do not join the corporation but continue to fish and market their own product separately. While this may seem an attractive option to fishers who believe they can do better independently, this is likely to be just a short-term advantage with an overall long-run cost to themselves as well as the rest of the industry. There are also a number of other areas that need further consideration, particularly in relation to the allocation of shares, including who should be allocated shares (e.g., just boat owners or also some employed skippers) and, similarly, how harvesting activity is to be allocated by the corporation to the fishers. These are largely issues that cannot be answered without substantial consultation with those likely to be affected, and these groups cannot give these issues serious consideration until the point at which they are likely to become a reality. Given the current structure and complexity of the fishery, it is unlikely that such a management structure will be feasible in the short term. However, the fishery is a prime candidate for such a model, and development of such a management structure in the future should be considered as an option for the longer term.

Relevance:

20.00%

Publisher:

Abstract:

A residual-based strategy to estimate the local truncation error in a finite volume framework for steady compressible flows is proposed. This estimator, referred to as the -parameter, is derived from the imbalance arising from the use of an exact operator on the numerical solution for conservation laws. The behaviour of the residual estimator for linear and non-linear hyperbolic problems is systematically analysed. The relationship of the residual to the global error is also studied. The -parameter is used to derive a target length scale and consequently to devise a suitable criterion for refinement/derefinement. This strategy, devoid of any user-defined parameters, is validated using two standard test cases involving smooth flows. A hybrid adaptive strategy, based on both error indicators and the -parameter, is also developed for flows involving shocks. Numerical studies on several compressible flow cases show that the adaptive algorithm performs very well in both two and three dimensions.
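As a rough illustration of the underlying idea (an "exact" operator applied to the numerical solution leaves a cell-wise imbalance that can drive refinement), the toy 1D sketch below flags the cells with the largest scaled residual of an assumed Burgers-type flux. The flux choice, the tanh profile and the 20% refinement fraction are arbitrary assumptions for illustration; the paper's actual parameter and target-length-scale derivation are not reproduced.

```python
import numpy as np

# Toy 1D residual-based refinement indicator.  For a steady conservation law
# d f(u)/dx = 0 (here an assumed Burgers-type flux f = u^2/2), applying a
# discrete form of the exact operator to the numerical solution leaves an
# imbalance; cells where that imbalance is large are flagged for refinement.
def refinement_flags(x, u, frac=0.2):
    """Flag the fraction `frac` of cells with the largest scaled residual."""
    f = 0.5 * u**2                              # assumed flux function
    dfdx = np.gradient(f, x)                    # discrete "exact operator"
    residual = np.abs(dfdx) * np.gradient(x)    # weight by local cell size
    cutoff = np.quantile(residual, 1.0 - frac)
    return residual >= cutoff

x = np.linspace(0.0, 1.0, 101)
u = np.tanh((x - 0.5) / 0.05)                   # hypothetical solution with a steep layer
print(np.where(refinement_flags(x, u))[0])      # flagged cells cluster around the layer
```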

Relevance:

20.00%

Publisher:

Abstract:

A simple error detecting and correcting procedure is described for nonbinary symbol words; here, the error position is located using the Hamming method and the correct symbol is substituted using a modulo-check procedure.
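The abstract does not give the construction itself, so the sketch below only illustrates the two-step locate-then-substitute idea for a single symbol error over a prime alphabet: for brevity it replaces the Hamming location step with a position-weighted checksum, and it recovers the correct symbol from a plain modulo checksum. The alphabet size, code layout and all names are illustrative assumptions, not the paper's scheme.

```python
# Locate-then-correct sketch for a single symbol error over Z_Q (Q prime).
# NOTE: the paper locates the error with the Hamming method; this sketch uses a
# position-weighted checksum instead, purely to keep the illustration short.
Q = 11  # illustrative symbol alphabet Z_11 (prime, so positions are invertible)

def encode(data):
    """Append two check symbols so that both syndromes of the codeword are 0 mod Q."""
    n = len(data) + 2
    A = sum(data) % Q
    B = sum(i * d for i, d in enumerate(data, start=1)) % Q
    c2 = (-B + (n - 1) * A) % Q
    c1 = (-A - c2) % Q
    return data + [c1, c2]

def decode(word):
    """Detect a single symbol error, locate it, and substitute the correct symbol."""
    S1 = sum(word) % Q                                        # error value e
    S2 = sum(i * w for i, w in enumerate(word, start=1)) % Q  # position * e
    if S1 == 0 and S2 == 0:
        return word                                           # no error detected
    pos = (S2 * pow(S1, -1, Q)) % Q                           # 1-based error position
    fixed = list(word)
    fixed[pos - 1] = (fixed[pos - 1] - S1) % Q                # modulo-check substitution
    return fixed

word = encode([3, 7, 0, 9, 5])
corrupted = list(word)
corrupted[2] = (corrupted[2] + 4) % Q
print(decode(corrupted) == word)   # True: error located and corrected
```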

Relevance:

20.00%

Publisher:

Abstract:

Digital elevation models (DEMs) have been an important topic in geography and surveying sciences for decades due to their geomorphological importance as the reference surface for gravitation-driven material flow, as well as their wide range of uses and applications. When a DEM is used in terrain analysis, for example in automatic drainage basin delineation, errors of the model accumulate in the analysis results. Investigation of this phenomenon is known as error propagation analysis, which has a direct influence on the decision-making process based on interpretations and applications of terrain analysis. Additionally, it may have an indirect influence on data acquisition and DEM generation. The focus of the thesis was on fine-toposcale DEMs, which are typically represented on a 5–50 m grid and used at application scales of 1:10 000–1:50 000. The thesis presents a three-step framework for investigating error propagation in DEM-based terrain analysis. The framework includes methods for visualising the morphological gross errors of DEMs, exploring the statistical and spatial characteristics of the DEM error, making analytical and simulation-based error propagation analyses and interpreting the error propagation analysis results. The DEM error model was built using geostatistical methods. The results show that appropriate and exhaustive reporting of various aspects of fine-toposcale DEM error is a complex task. This is due to the high number of outliers in the error distribution and the morphological gross errors, which are detectable with the presented visualisation methods. In addition, the use of a global characterisation of DEM error is a gross generalisation of reality due to the small extent of the areas in which the assumption of stationarity is not violated. This was shown using an exhaustive high-quality reference DEM based on airborne laser scanning and local semivariogram analysis. The error propagation analysis revealed that, as expected, an increase in the DEM vertical error will increase the error in surface derivatives. However, contrary to expectations, the spatial autocorrelation of the model appears to have varying effects on the error propagation analysis depending on the application. The use of a spatially uncorrelated DEM error model has been considered a 'worst-case scenario', but this opinion is now challenged because none of the DEM derivatives investigated in the study had maximum variation with spatially uncorrelated random error. Significant performance improvement was achieved in simulation-based error propagation analysis by applying process convolution in generating realisations of the DEM error model. In addition, a typology of uncertainty in drainage basin delineations is presented.
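A minimal sketch of the kind of simulation-based error propagation described above might look like the following: spatially autocorrelated DEM error realisations are generated (here crudely, by Gaussian smoothing of white noise rather than the thesis's geostatistical/process-convolution model), added to a DEM, and the per-cell spread of a derived quantity (slope) is tracked. The grid, error standard deviation and correlation range are hypothetical placeholders.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

# Monte Carlo DEM error propagation into slope (illustrative only: smoothed white
# noise stands in for a proper geostatistical error model; all values hypothetical).
rng = np.random.default_rng(0)
ny, nx, cell = 200, 200, 10.0                      # grid size and 10 m cell size
xx, yy = np.meshgrid(np.arange(nx), np.arange(ny))
dem = 50.0 * np.sin(xx / 30.0) + 0.2 * yy          # synthetic stand-in DEM

sigma_z, corr_cells, n_sim = 1.5, 3.0, 100         # error SD (m), correlation range, runs
slopes = []
for _ in range(n_sim):
    noise = gaussian_filter(rng.standard_normal((ny, nx)), corr_cells)
    noise *= sigma_z / noise.std()                 # rescale to the target error SD
    dz_dy, dz_dx = np.gradient(dem + noise, cell)
    slopes.append(np.degrees(np.arctan(np.hypot(dz_dx, dz_dy))))

slope_sd = np.std(slopes, axis=0)                  # per-cell spread of derived slope
print(f"mean slope SD across the grid: {slope_sd.mean():.2f} degrees")
```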

Relevance:

20.00%

Publisher:

Abstract:

This thesis addresses modeling of financial time series, especially stock market returns and daily price ranges. Modeling data of this kind can be approached with so-called multiplicative error models (MEM). These models nest several well known time series models such as GARCH, ACD and CARR models. They are able to capture many well established features of financial time series including volatility clustering and leptokurtosis. In contrast to these phenomena, different kinds of asymmetries have received relatively little attention in the existing literature. In this thesis asymmetries arise from various sources. They are observed in both conditional and unconditional distributions, for variables with non-negative values and for variables that have values on the real line. In the multivariate context asymmetries can be observed in the marginal distributions as well as in the relationships of the variables modeled. New methods for all these cases are proposed. Chapter 2 considers GARCH models and modeling of returns of two stock market indices. The chapter introduces the so-called generalized hyperbolic (GH) GARCH model to account for asymmetries in both conditional and unconditional distribution. In particular, two special cases of the GARCH-GH model which describe the data most accurately are proposed. They are found to improve the fit of the model when compared to symmetric GARCH models. The advantages of accounting for asymmetries are also observed through Value-at-Risk applications. Both theoretical and empirical contributions are provided in Chapter 3 of the thesis. In this chapter the so-called mixture conditional autoregressive range (MCARR) model is introduced, examined and applied to daily price ranges of the Hang Seng Index. The conditions for the strict and weak stationarity of the model as well as an expression for the autocorrelation function are obtained by writing the MCARR model as a first order autoregressive process with random coefficients. The chapter also introduces inverse gamma (IG) distribution to CARR models. The advantages of CARR-IG and MCARR-IG specifications over conventional CARR models are found in the empirical application both in- and out-of-sample. Chapter 4 discusses the simultaneous modeling of absolute returns and daily price ranges. In this part of the thesis a vector multiplicative error model (VMEM) with asymmetric Gumbel copula is found to provide substantial benefits over the existing VMEM models based on elliptical copulas. The proposed specification is able to capture the highly asymmetric dependence of the modeled variables thereby improving the performance of the model considerably. The economic significance of the results obtained is established when the information content of the volatility forecasts derived is examined.
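For readers unfamiliar with the MEM family, the sketch below simulates and estimates a basic univariate MEM(1,1) by exponential quasi-maximum likelihood; it is a generic illustration of the model class only, not the GH-GARCH, MCARR-IG or copula-VMEM specifications developed in the thesis, and all parameter values are made up.

```python
import numpy as np
from scipy.optimize import minimize

# Basic MEM(1,1): x_t = mu_t * eps_t,  mu_t = omega + alpha*x_{t-1} + beta*mu_{t-1},
# with eps_t >= 0 and E[eps_t] = 1 (here exponential).  Estimated by exponential QML.
rng = np.random.default_rng(1)

def simulate(n, omega=0.1, alpha=0.2, beta=0.7):
    x, mu = np.empty(n), omega / (1 - alpha - beta)   # start at the unconditional mean
    for t in range(n):
        x[t] = mu * rng.exponential(1.0)              # unit-mean multiplicative innovation
        mu = omega + alpha * x[t] + beta * mu
    return x

def neg_qml(params, x):
    omega, alpha, beta = params
    if omega <= 0 or alpha < 0 or beta < 0 or alpha + beta >= 1:
        return np.inf                                 # enforce positivity and stationarity
    mu, ll = x.mean(), 0.0
    for xt in x:
        ll += -np.log(mu) - xt / mu                   # exponential log-likelihood term
        mu = omega + alpha * xt + beta * mu
    return -ll

x = simulate(3000)
fit = minimize(neg_qml, x0=[0.05, 0.1, 0.8], args=(x,), method="Nelder-Mead")
print("estimated omega, alpha, beta:", np.round(fit.x, 3))
```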

Relevance:

20.00%

Publisher:

Abstract:

A growing body of research is concerned with deviance in the workplace. While much research has explored negative forms of deviance, we examine constructive deviance: behaviour that deviates from salient norms and benefits the reference group. We empirically explore manifestations, determinants and performance outcomes of constructive deviance in standardised work processes. We do this through a mixed-methods study in bakery trading departments of an Australian retailer. We illustrate that constructive deviance occurs in these settings and show that some manifestations of constructive deviance improve organisational performance and pave the way for applying constructive deviance as a strategic tool in retail.

Relevance:

20.00%

Publisher:

Abstract:

The paper presents an innovative approach to modelling the causal relationships of human errors in rail crack incidents (RCI) from a managerial perspective. A Bayesian belief network is developed to model RCI by considering the human errors of designers, manufacturers, operators and maintainers (DMOM) and the causal relationships involved. A set of dependent variables whose combinations express the relevant functions performed by each DMOM participant is used to model the causal relationships. A total of 14 RCI on Hong Kong's mass transit railway (MTR) from 2008 to 2011 are used to illustrate the application of the model. Bayesian inference is used to conduct an importance analysis to assess the impact of the participants' errors. Sensitivity analysis is then employed to gauge the effect of an increased probability of occurrence of human errors on RCI. Finally, strategies for human error identification and mitigation of RCI are proposed. The identification of the maintainer's ability in the case study as the most important factor influencing the probability of RCI implies a priority need to strengthen the maintenance management of the MTR system, and suggests that improving the inspection ability of the maintainer is likely to be an effective strategy for RCI risk mitigation.
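The following sketch shows, with a hand-rolled enumeration rather than a Bayesian network library, how the posterior "importance" of each DMOM participant given an observed RCI could be computed. The network structure (a noisy-OR style RCI node) and every probability are hypothetical placeholders, not the paper's fitted model.

```python
from itertools import product

# Hypothetical priors on each participant committing an error, and hypothetical
# noisy-OR weights linking each error to a rail crack incident (RCI).
prior = {"D": 0.05, "M": 0.10, "O": 0.15, "Mt": 0.20}
leak, weight = 0.01, {"D": 0.30, "M": 0.25, "O": 0.20, "Mt": 0.40}

def p_rci(state):
    """Noisy-OR: each active error independently contributes to causing an RCI."""
    p_no = 1 - leak
    for k, on in state.items():
        if on:
            p_no *= 1 - weight[k]
    return 1 - p_no

def posterior(target):
    """P(target participant erred | an RCI occurred), by full enumeration."""
    num = den = 0.0
    for bits in product([0, 1], repeat=len(prior)):
        state = dict(zip(prior, bits))
        p_state = 1.0
        for k, on in state.items():
            p_state *= prior[k] if on else 1 - prior[k]
        joint = p_state * p_rci(state)
        den += joint
        if state[target]:
            num += joint
    return num / den

for k in prior:
    print(k, round(posterior(k), 3))   # relative importance of each participant given an RCI
```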

Relevance:

20.00%

Publisher:

Abstract:

Visual acuities at the time of referral and on the day before surgery were compared in 124 patients operated on for cataract in Vaasa Central Hospital, Finland. Preoperative visual acuity and the occurrence of ocular and general disease were compared in samples of consecutive cataract extractions performed in 1982, 1985, 1990, 1995 and 2000 in two hospitals in the Vaasa region in Finland. The repeatability and standard deviation of random measurement error in visual acuity and refractive error determination in a clinical environment in cataractous, pseudophakic and healthy eyes were estimated by re-examining the visual acuity and refractive error of patients referred for cataract surgery or consultation by ophthalmic professionals. Altogether 99 eyes of 99 persons (41 cataractous, 36 pseudophakic and 22 healthy eyes) with a visual acuity range of Snellen 0.3 to 1.3 (0.52 to -0.11 logMAR) were examined. During an average waiting time of 13 months, visual acuity in the study eye deteriorated from 0.68 logMAR to 0.96 logMAR (from 0.2 to 0.1 in Snellen decimal values). The average decrease in vision was 0.27 logMAR per year. In the fastest quartile, visual acuity change per year was 0.75 logMAR, and in the second fastest 0.29 logMAR; the third and fourth quartiles were virtually unaffected. From 1982 to 2000, the incidence of cataract surgery increased from 1.0 to 7.2 operations per 1000 inhabitants per year in the Vaasa region. The average preoperative visual acuity in the operated eye improved by 0.85 logMAR (in decimal values from 0.03 to 0.2) and in the better eye by 0.27 logMAR (in decimal values from 0.23 to 0.43) over this period. The proportion of patients profoundly visually handicapped (VA in the better eye <0.1) before the operation fell from 15% to 4%, and that of patients less profoundly visually handicapped (VA in the better eye 0.1 to <0.3) from 47% to 15%. The repeatability of visual acuity measurement, estimated as a coefficient of repeatability for all 99 eyes, was ±0.18 logMAR, and the standard deviation of measurement error was 0.06 logMAR. Eyes with the lowest visual acuity (0.3–0.45) had the largest variability, with coefficient of repeatability values of ±0.24 logMAR, while eyes with a visual acuity of 0.7 or better had the smallest, ±0.12 logMAR. The repeatability of refractive error measurement was studied in the same patient material as the repeatability of visual acuity. Differences between measurements 1 and 2 were calculated as three-dimensional vector values and spherical equivalents and expressed by coefficients of repeatability. Coefficients of repeatability for all eyes for vertical, torsional and horizontal vectors were ±0.74D, ±0.34D and ±0.93D, respectively, and for the spherical equivalent for all eyes ±0.74D. Eyes with lower visual acuity (0.3–0.45) had larger variability in vector and spherical equivalent values (±1.14), but the difference between visual acuity groups was not statistically significant. The difference in the mean defocus equivalent between measurements 1 and 2 was, however, significantly greater in the lower visual acuity group. If a change of ±0.5D (measured in defocus equivalents) is accepted as a basis for change of spectacles for eyes with good vision, the basis for eyes in the visual acuity range of 0.3–0.65 would be ±1D. Differences in repeated visual acuity measurements are partly explained by errors in refractive error measurements.
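A coefficient of repeatability of the kind reported above is commonly computed Bland-Altman style as 1.96 times the standard deviation of the test-retest differences; the short sketch below illustrates this on simulated logMAR data, assuming the 0.06 logMAR measurement error quoted above (all values hypothetical, not the study's data).

```python
import numpy as np

# Coefficient of repeatability (Bland-Altman style): CoR = 1.96 * SD of the paired
# test-retest differences, so ~95% of repeat measurements of the same eye are
# expected to differ by less than CoR.  Simulated data, not the study's.
rng = np.random.default_rng(2)
true_va = rng.uniform(-0.1, 0.5, size=99)          # 99 eyes, true logMAR acuity
meas1 = true_va + rng.normal(0, 0.06, size=99)     # assumed 0.06 logMAR measurement SD
meas2 = true_va + rng.normal(0, 0.06, size=99)

diff = meas1 - meas2
cor = 1.96 * diff.std(ddof=1)
print(f"coefficient of repeatability: +/- {cor:.2f} logMAR")  # close to the ±0.18 reported
```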

Relevance:

20.00%

Publisher:

Abstract:

Road policing is an important tool used to modify road user behaviour. While other theories, such as deterrence theory, are significant in road policing, there may be a role for using procedural justice as a framework to improve outcomes in common police citizen interactions such as traffic law enforcement. This study, using a sample of 237 young novice drivers, considered how the four elements of procedural justice (voice, neutrality, respect and trustworthiness) were perceived in relation to two forms of speed enforcement: point-to-point (or average) speed and mobile speed cameras. Only neutrality was related to both speed camera types suggesting that it may be possible to influence behaviour by emphasising one or more elements, rather than using all components of procedural justice. This study is important as it indicates that including at least some elements of procedural justice in more automated policing encounters can encourage citizen compliance.

Relevance:

20.00%

Publisher:

Abstract:

Purpose: Road policing is a key method used to improve driver compliance with road laws. However, we have a very limited understanding of the perceptions of young drivers regarding police enforcement of road laws. This paper addresses this gap.
Design/Methodology/Approach: Within this study, 238 young drivers from Queensland, Australia, aged 17–24 years (M = 18, SD = 1.54), with a provisional (intermediate) driver's licence completed an online survey regarding their perceptions of police enforcement and their driver thrill-seeking tendencies. This study considered whether these factors influenced self-reported transient (e.g., travelling speed) and fixed (e.g., blood alcohol concentration) road violations by the young drivers.
Findings: The results indicate that being detected by police for a traffic offence, and the frequency with which they display P-plates on their vehicle to indicate their licence status, are associated with both self-reported transient and fixed rule violations. Licence type, police avoidance behaviours and driver thrill-seeking affected transient rule violations only, while perceptions of police enforcement affected fixed rule violations only.
Practical implications: This study suggests that police enforcement of young driver violations of traffic laws may not be as effective as expected and that we need to improve the way in which police enforce road laws for young novice drivers.
Originality/value: This paper identifies that perceptions of police enforcement by young drivers do not influence all types of road offences.

Relevance:

20.00%

Publisher:

Abstract:

With technology scaling, vulnerability to soft errors in random logic is increasing. There is a need for on-line error detection and protection for logic gates even at sea level. The error checker is the key element of an on-line detection mechanism. We compare three different checkers for error detection from the point of view of area, power and false error detection rates. We find that the double sampling checker (used in Razor) is the simplest and the most area- and power-efficient, but suffers from very high false detection rates of 1.15 times the actual error rates. We also find that the alternative approaches of triple sampling and the integrate-and-sample (I&S) method can be designed to have zero false detection rates, but at increased area, power and implementation complexity. The triple sampling method has about 1.74 times the area and twice the power compared to the double sampling method, and also needs a complex clock generation scheme. The I&S method needs about 16% more power than double sampling, with 0.58 times the area, but comes with more stringent implementation constraints as it requires detection of small voltage swings.
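The toy simulation below contrasts the decision logic of two of these checkers: a double sampling checker flags any disagreement between the main and the redundant (shadow) sample, so a disturbance that corrupts only the shadow sample is still flagged (a false detection), whereas a three-sample majority vote out-votes a single corrupted sample. The fault model (exactly one sample flipped per strike) and all numbers are made up for illustration, and the I&S checker is not modelled.

```python
import random
random.seed(0)

def double_sampling_flags(main, shadow):
    """Razor-style checker: any main/shadow disagreement is flagged as an error."""
    return main != shadow

def triple_sampling_flags(s0, s1, s2):
    """Flag an error only when the main sample s0 loses the majority vote."""
    return s0 != (1 if s0 + s1 + s2 >= 2 else 0)

trials, false_double, false_triple = 10_000, 0, 0
for _ in range(trials):
    good = random.randint(0, 1)                 # the correct logic value
    pair = [good, good]                         # double sampling: main + shadow
    pair[random.randrange(2)] ^= 1              # a strike flips one of the two samples
    if pair[0] == good and double_sampling_flags(*pair):
        false_double += 1                       # main value was fine: false detection
    triple = [good, good, good]                 # triple sampling: three samples
    triple[random.randrange(3)] ^= 1            # a strike flips one of the three samples
    if triple[0] == good and triple_sampling_flags(*triple):
        false_triple += 1
print(false_double, false_triple)               # roughly half the strikes vs. zero
```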

Relevance:

20.00%

Publisher:

Abstract:

Driver fatigue has received increased attention during recent years and is now considered to be a major contributor to approximately 15–30% of all crashes. However, little is known about fatigue in city bus drivers. It is hypothesized that city bus drivers suffer from sleepiness, which is due to a combination of working conditions, poor health and reduced sleep quantity and quality. The overall aim of the current study is to investigate whether severe driver sleepiness, as indicated by subjective reports of having to fight sleep while driving, is a problem for city-based bus drivers in Sweden and, if so, to identify the determinants related to working conditions, health and sleep which contribute to it. The results indicate that driver sleepiness is a problem for city bus drivers, with 19% having to fight to stay awake while driving the bus 2–3 times each week or more and nearly half experiencing this at least 2–4 times per month. In conclusion, severe sleepiness, as indicated by having to fight sleep during driving, was common among the city bus drivers. Severe sleepiness correlated with fatigue-related safety risks, such as near crashes.

Relevance:

20.00%

Publisher:

Abstract:

A constant switching frequency current error space vector-based hysteresis controller for two-level voltage source inverter-fed induction motor (IM) drives is proposed in this study. The proposed controller is capable of driving the IM in the entire speed range, extending to the six-step mode. The proposed controller uses the parabolic boundary, reported earlier, for vector selection in a sector, but uses simple, fast and self-adaptive sector identification logic for sector change detection in the entire modulation range. This new scheme detects the sector change using the change in direction of the current error along the axes jA, jB and jC. Most of the previous schemes use an outer boundary for sector change detection, so the current error goes outside the boundary six times (once at each sector change) in one cycle, introducing additional fifth and seventh harmonic components in the phase current. This may cause sixth harmonic torque pulsations in the motor and spread in the harmonic spectrum of the phase voltage. The proposed new scheme detects the sector change quickly and accurately, eliminating the chance of introducing additional fifth and seventh harmonic components in the phase current, and provides a phase-voltage harmonic spectrum that exactly matches that of constant switching frequency voltage-controlled space vector pulse width modulation (VC-SVPWM)-based two-level inverter-fed drives.
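As a minimal illustration of the quantities involved, the sketch below forms the current-error space vector from the three phase-current errors via a Clarke transform and tests it against a simple circular boundary. The circular boundary and all numerical values are arbitrary assumptions for illustration; the paper's parabolic boundary and the jA/jB/jC sector-change logic are not reproduced here.

```python
import numpy as np

# Build the current-error space vector from per-phase errors and test it against
# a boundary.  A circular boundary is used purely for illustration; the paper's
# parabolic boundary and sector-change detection are not reproduced.
def error_space_vector(i_ref_abc, i_meas_abc):
    """Clarke transform of the three phase-current errors into the alpha-beta plane."""
    ea, eb, ec = np.asarray(i_ref_abc) - np.asarray(i_meas_abc)
    e_alpha = (2.0 / 3.0) * (ea - 0.5 * eb - 0.5 * ec)
    e_beta = (1.0 / np.sqrt(3.0)) * (eb - ec)
    return e_alpha + 1j * e_beta

def boundary_crossed(err_vec, radius=0.5):
    """True when the error vector leaves the (illustrative) circular boundary."""
    return abs(err_vec) > radius

err = error_space_vector([10.0, -4.0, -6.0], [9.2, -3.5, -5.7])  # arbitrary currents (A)
print(err, boundary_crossed(err))
```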

Relevance:

20.00%

Publisher:

Abstract:

In handling large volumes of data such as chemical notations, serial numbers for books, etc., it is always advisable to provide checking methods which would indicate the presence of errors. The entire new discipline of coding theory is devoted to the study of the construction of codes which provide such error-detecting and correcting means. Although these codes are very powerful, they are highly sophisticated from the point of view of practical implementation.
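A familiar concrete example of such a checking method, specifically for book serial numbers, is the ISBN-10 check digit: the position-weighted sum of the ten digits must be divisible by 11, which detects any single-digit error and any transposition of two digits. The short implementation below is a straightforward illustration of that scheme.

```python
# ISBN-10 check digit: weights 10..1 over the ten digits, and the weighted sum
# must be divisible by 11 ('X' stands for a check value of 10).

def isbn10_check_digit(first9):
    """Return the check character ('0'-'9' or 'X') for the first nine digits."""
    total = sum(w * int(d) for w, d in zip(range(10, 1, -1), first9))
    r = (11 - total % 11) % 11
    return "X" if r == 10 else str(r)

def isbn10_is_valid(isbn):
    """Validate a full ISBN-10, ignoring hyphens and spaces."""
    digits = [10 if c in "xX" else int(c) for c in isbn if c not in "- "]
    total = sum(w * d for w, d in zip(range(10, 0, -1), digits))
    return len(digits) == 10 and total % 11 == 0

print(isbn10_check_digit("030640615"))     # -> '2'
print(isbn10_is_valid("0-306-40615-2"))    # -> True
print(isbn10_is_valid("0-306-40165-2"))    # transposed digits -> False
```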