Abstract:
Motorcyclists are the most crash-prone road-user group in many Asian countries, including Singapore; however, the factors influencing motorcycle crashes are still not well understood. This study examines the effects of various roadway characteristics, traffic control measures and environmental factors on motorcycle crashes at different location types, including expressways and intersections. Using techniques of categorical data analysis, this study developed a set of log-linear models to investigate multi-vehicle motorcycle crashes in Singapore. Motorcycle crash risks in different circumstances were calculated after controlling for exposure, estimated using the induced-exposure technique. Results show that night-time conditions increase motorcycle crash risk, particularly during merging and diverging manoeuvres on expressways and turning manoeuvres at intersections. Riders appear to exercise more care while riding on wet road surfaces, particularly at night. Many hazardous interactions at intersections tend to be related to the failure of drivers to notice a motorcycle or to judge correctly the speed and distance of an oncoming motorcycle. Roadside conflicts due to stopping or waiting vehicles, and interactions with opposing traffic on undivided roads, were found to be detrimental to motorcycle safety along arterial, main and local roads away from intersections. Based on these findings, several targeted countermeasures in the form of legislation, rider training and safety awareness programmes are recommended.
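The induced-exposure idea used in this abstract can be illustrated with a toy calculation: not-at-fault crash involvements serve as a proxy for exposure, so a night-versus-day relative risk can be formed from a simple contingency table. The counts below are hypothetical, not data from the study.

```python
# Hypothetical 2x2 table of motorcycle crash involvements by lighting
# condition and fault status. The induced-exposure technique treats
# not-at-fault involvements as an exposure proxy.
counts = {
    ("night", "at_fault"): 120,
    ("night", "not_at_fault"): 60,
    ("day", "at_fault"): 300,
    ("day", "not_at_fault"): 250,
}

def induced_exposure_risk(counts, condition):
    """Ratio of at-fault to not-at-fault involvements under one condition."""
    return counts[(condition, "at_fault")] / counts[(condition, "not_at_fault")]

# Relative risk of night versus day riding; a value above 1 suggests
# elevated night-time crash risk after controlling for exposure.
relative_risk = induced_exposure_risk(counts, "night") / induced_exposure_risk(counts, "day")
print(round(relative_risk, 2))
```

A full log-linear analysis would model all factor combinations jointly, but the exposure-controlling step reduces to ratios of this kind.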
Abstract:
Mobile devices are becoming indispensable personal assistants in people's daily lives, supporting work, study, play and socializing. The multi-modal sensors and rich features of smartphones can capture abundant information about users' life experiences, such as photos and videos of what they see and hear, and the tasks and activities they organize with calendars, to-do lists and notes. This vast body of information can help users recall episodic memories and reminisce about meaningful experiences. In this paper, we propose applying the autobiographical memory framework as an effective mechanism for structuring mobile life-log data. The proposed model is a step towards a more complete personal life-log indexing model supporting long-term capture, organization and retrieval. To demonstrate its benefits, we propose design solutions for tools enabling user-driven capture, annotation and retrieval of autobiographical multimedia chronicles.
Abstract:
This study uses borehole geophysical log data of sonic velocity and electrical resistivity to estimate permeability in sandstones of the northern Galilee Basin, Queensland. Prior estimates of permeability are calculated from deterministic log–log linear empirical correlations between electrical resistivity and measured permeability; both negative and positive relationships are influenced by clay content. The prior estimates are then updated in a Bayesian framework for three boreholes, using both a cokriging (CK) method and a normal linear regression (NLR) approach to infer the likelihood function. The results show that the mean permeability estimated by the CK-based Bayesian method agrees better with the measured permeability when a fairly clear linear relationship exists between the logarithm of permeability and sonic velocity. In contrast, the NLR-based Bayesian approach gives better permeability estimates for boreholes where no such linear relationship exists.
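The Bayesian updating step the abstract describes can be reduced, in its simplest one-dimensional form, to a conjugate normal update of a prior on log10 permeability with a likelihood inferred from a second log. This is a deliberately simplified sketch of the NLR-style approach; the prior, observation and variances below are invented, not basin data.

```python
def normal_bayes_update(prior_mean, prior_var, obs, obs_var):
    """Conjugate update of a normal prior with a normal likelihood.

    Precision (inverse variance) adds; the posterior mean is the
    precision-weighted average of prior and observation.
    """
    post_var = 1.0 / (1.0 / prior_var + 1.0 / obs_var)
    post_mean = post_var * (prior_mean / prior_var + obs / obs_var)
    return post_mean, post_var

# Prior on log10 permeability from the resistivity correlation, updated
# with an "observation" derived from sonic velocity (hypothetical values).
m, v = normal_bayes_update(prior_mean=-1.0, prior_var=0.5**2,
                           obs=-1.4, obs_var=0.3**2)
```

The posterior mean lands between the prior and the observation, pulled towards whichever is more precise, and the posterior variance is smaller than either input variance.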
Abstract:
An important aspect of decision support systems is applying sophisticated, flexible statistical models to real datasets and communicating the results to decision makers in interpretable ways. One important class of problems is the modelling of incidence, such as fire or disease. Models of incidence known as point processes or Cox processes are particularly challenging because they are 'doubly stochastic': obtaining the probability mass function of incidents requires two integrals to be evaluated. Existing approaches either use simple models that obtain predictions from plug-in point estimates and do not distinguish between Cox processes and density estimation, but do use sophisticated 3D visualization for interpretation; or they employ sophisticated non-parametric Bayesian Cox process models without using visualization to render interpretable, complex spatio-temporal forecasts. The contribution of this work is to fill this gap by inferring predictive distributions of log-Gaussian Cox processes and rendering them with state-of-the-art 3D visualization techniques. This requires performing inference on an approximation of the model over a large discretized grid and adapting an existing spatial-diurnal kernel to the log-Gaussian Cox process context.
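The grid approximation the abstract relies on can be sketched generatively: sample a latent Gaussian field on the grid, exponentiate it to obtain a guaranteed-positive intensity, and draw Poisson counts per cell. Grid size and kernel parameters below are illustrative choices, not those of the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# A log-Gaussian Cox process approximated on a discretized grid:
# intensity = exp(latent Gaussian field); counts ~ Poisson(intensity * area).
n = 15
xs = np.linspace(0.0, 1.0, n)
X, Y = np.meshgrid(xs, xs)
pts = np.column_stack([X.ravel(), Y.ravel()])

# Squared-exponential covariance between grid cells (illustrative kernel).
d2 = ((pts[:, None, :] - pts[None, :, :]) ** 2).sum(axis=-1)
K = np.exp(-d2 / (2 * 0.2**2))

# Sample the latent field via eigendecomposition (robust even when K is
# numerically near-singular, as smooth kernels on dense grids tend to be).
w, v = np.linalg.eigh(K)
latent = v @ (np.sqrt(np.clip(w, 0.0, None)) * rng.standard_normal(n * n))

intensity = np.exp(latent)              # exponentiation keeps intensity positive
counts = rng.poisson(intensity / n**2)  # cell area is roughly 1/n^2
```

Inference runs this construction in reverse — conditioning the latent field on observed counts — which is what makes the model doubly stochastic and expensive at scale.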
Abstract:
To this day, standard-model realizations of (lossy) trapdoor functions from discrete-log-type assumptions require large public key sizes, e.g., about Θ(λ²) group elements for a reduction from the decisional Diffie-Hellman assumption (where λ is a security parameter). We propose two realizations of lossy trapdoor functions that achieve a public key size of only Θ(λ) group elements in bilinear groups, with a reduction from the decisional Bilinear Diffie-Hellman assumption. Our first construction achieves this at the expense of a long common reference string of Θ(λ²) elements, albeit one reusable across multiple LTDF instantiations. Our second scheme also achieves public keys of size Θ(λ), entirely in the standard model and in particular without any reference string, at the cost of a slightly more involved construction. The main technical novelty, developed for the second scheme, is a compact encoding technique for generating compressed representations of certain sequences of group elements in the public parameters.
Abstract:
Objective: While many jurisdictions internationally now require learner drivers to complete a specified number of hours of supervised driving practice before driving unaccompanied, very few require learners to record this practice in a log book and present it to the licensing authority. Learner drivers in most Australian jurisdictions must complete a log book recording their practice, thereby confirming to the licensing authority that they have met the mandated hours of practice. These log books facilitate the management and enforcement of minimum supervised-driving requirements. Method: Parents of learner drivers in two Australian states, Queensland and New South Wales, completed an online survey assessing a range of factors, including their perceptions of the accuracy of their child's learner log book and the effectiveness of the log book system. Results: The study indicates that the large majority of parents believe their child's learner log book is accurate. However, they generally rate the log book system as only moderately effective at measuring the number of hours of supervised practice a learner driver has completed. Conclusions: The results suggest a paradox: many parents may believe that others are not as diligent in their use of log books as they are, or that the system is too open to misuse. Given that many parents report that their child's log book is accurate, this study has important implications for the development and ongoing monitoring of hours-of-practice requirements in graduated driver licensing systems.
Abstract:
In the commercial food industry, demonstration of microbiological safety and thermal process equivalence often involves a mathematical framework that assumes log-linear inactivation kinetics and invokes concepts of decimal reduction time (DT), z values, and accumulated lethality. However, many microbes, particularly spores, exhibit inactivation kinetics that are not log linear. This has led to alternative modeling approaches, such as the biphasic and Weibull models, that relax strong log-linear assumptions. Using a statistical framework, we developed a novel log-quadratic model, which approximates the biphasic and Weibull models and provides additional physiological interpretability. As a statistical linear model, the log-quadratic model is relatively simple to fit and straightforwardly provides confidence intervals for its fitted values. It allows a DT-like value to be derived, even from data that exhibit obvious "tailing." We also showed how existing models of non-log-linear microbial inactivation, such as the Weibull model, can fit into a statistical linear model framework that dramatically simplifies their solution. We applied the log-quadratic model to thermal inactivation data for the spore-forming bacterium Clostridium botulinum and evaluated its merits compared with those of popular previously described approaches. The log-quadratic model was used as the basis of a secondary model that can capture the dependence of microbial inactivation kinetics on temperature. This model, in turn, was linked to models of spore inactivation of Sapru et al. and Rodriguez et al. that posit different physiological states for spores within a population. We believe that the log-quadratic model provides a useful framework in which to test vitalistic and mechanistic hypotheses of inactivation by thermal and other processes. Copyright © 2009, American Society for Microbiology. All Rights Reserved.
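Because the log-quadratic model is linear in its coefficients, an ordinary least-squares fit suffices, as the abstract notes. The sketch below fits log10 survivor counts that exhibit tailing and derives a DT-like value from the initial slope; the data are invented for illustration, not C. botulinum measurements.

```python
import numpy as np

# Hypothetical thermal-inactivation data: heating time (min) and log10
# survivor counts showing the "tailing" that defeats a log-linear model.
t = np.array([0.0, 2.0, 4.0, 6.0, 8.0, 10.0])
log_n = np.array([8.0, 6.1, 4.6, 3.5, 2.8, 2.5])

# Log-quadratic model: log10 N(t) = a + b*t + c*t^2. This is linear in
# (a, b, c), so a polynomial least-squares fit solves it directly.
c, b, a = np.polyfit(t, log_n, 2)

# A DT-like value: time for the first decimal reduction, taken from the
# slope of the survival curve at t = 0 (where the curve is steepest here).
dt_like = -1.0 / b
```

Here the fitted initial slope is about -1.05 log10 per minute, giving a DT-like value just under one minute, while the positive quadratic term captures the flattening tail that a single D value would miss.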
Abstract:
This paper addresses the problem of identifying and explaining behavioral differences between two business process event logs. The paper presents a method that, given two event logs, returns a set of statements in natural language capturing behavior that is present or frequent in one log, while absent or infrequent in the other. This log delta analysis method allows users to diagnose differences between normal and deviant executions of a process or between two versions or variants of a process. The method relies on a novel approach to losslessly encode an event log as an event structure, combined with a frequency-enhanced technique for differencing pairs of event structures. A validation of the proposed method shows that it accurately diagnoses typical change patterns and can explain differences between normal and deviant cases in a real-life log, more compactly and precisely than previously proposed methods.
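A drastically simplified flavour of log delta analysis can be sketched by replacing the paper's lossless event-structure encoding with directly-follows counts, then flagging behaviour frequent in one log but infrequent in the other. The logs and the frequency threshold below are hypothetical.

```python
from collections import Counter

# Two toy event logs, each a list of traces (sequences of activities).
log_a = [("a", "b", "c")] * 8 + [("a", "c", "b")] * 2
log_b = [("a", "b", "c")] * 3 + [("a", "d", "c")] * 7

def directly_follows(log):
    """Count how often each activity pair occurs consecutively in a log.
    A crude stand-in for the paper's event-structure differencing."""
    pairs = Counter()
    for trace in log:
        for x, y in zip(trace, trace[1:]):
            pairs[(x, y)] += 1
    return pairs

pa, pb = directly_follows(log_a), directly_follows(log_b)

# Behaviour frequent in log A (threshold 5, arbitrary) but rare in log B;
# the symmetric direction would be computed the same way.
delta = {p for p in pa if pa[p] >= 5 and pb.get(p, 0) < 5}
```

The actual method goes considerably further, producing natural-language statements and handling concurrency losslessly, but the frequency-differencing intuition is the same.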
Abstract:
In structural brain MRI, group differences or changes in brain structures can be detected using Tensor-Based Morphometry (TBM). This method consists of two steps: (1) a non-linear registration step that aligns all of the images to a common template, and (2) a subsequent statistical analysis. The numerous registration methods developed recently differ in their detection sensitivity when used for TBM, and detection power is paramount in epidemiological studies and drug trials. We therefore developed a new fluid registration method that computes the mappings and performs statistics on them in a consistent way, providing a bridge between TBM registration and statistics. We used the Log-Euclidean framework to define a new regularizer that is a fluid extension of the Riemannian elasticity, which ensures diffeomorphic transformations. This regularizer constrains the symmetrized Jacobian matrix, also called the deformation tensor. We applied our method to an MRI dataset from 40 fraternal and identical twins, revealing voxelwise measures of average volumetric differences in brain structure for subjects with different degrees of genetic resemblance.
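The deformation tensor and its matrix logarithm, the quantity the Log-Euclidean framework manipulates, can be computed for a single voxel's Jacobian as follows. This is a toy 2x2 example with invented values; the actual method operates on full 3D deformation fields.

```python
import numpy as np

def spd_log(m):
    """Matrix logarithm of a symmetric positive-definite matrix via
    eigendecomposition: the basic operation of the Log-Euclidean framework."""
    w, v = np.linalg.eigh(m)
    return v @ np.diag(np.log(w)) @ v.T

# Hypothetical Jacobian of the mapping at one voxel.
J = np.array([[1.2, 0.1],
              [0.0, 0.9]])

# Symmetrized Jacobian (deformation tensor): S = sqrt(J^T J), computed
# through the eigendecomposition of J^T J.
w, v = np.linalg.eigh(J.T @ J)
S = v @ np.diag(np.sqrt(w)) @ v.T

# In the Log-Euclidean framework, regularization and statistics are done
# on log(S), where SPD matrices behave like an ordinary vector space.
L = spd_log(S)
```

Working on log(S) is what makes Euclidean-style averaging and penalty terms legitimate for tensors that must remain positive-definite.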
Abstract:
The standard curve-fitting methods for determining the coefficient of consolidation, Casagrande's log t method and Taylor's root t method, use the later part of the consolidation curve and are influenced by secondary compression effects. The literature shows that secondary compression occurs concurrently with primary consolidation and that its effect is to decrease the value of the coefficient of consolidation. If the early part of the time-compression data is used, the values obtained are less influenced by secondary compression. A method that uses the early part of the log t plot is proposed in this technical note. As the influence of secondary compression is reduced, the value obtained by this method is greater than that yielded by either standard method. The permeability values computed from C_v obtained by the proposed method agree more closely with measured values than those from the standard methods, showing that the effects of secondary compression are minimized. Time-compression data of shorter duration are sufficient for determining C_v if the coefficient of secondary compression is not required.
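For reference, once t50 has been read off a log t construction, the coefficient of consolidation follows from the standard relation C_v = T50·H²/t50, with time factor T50 = 0.197 for 50% average degree of consolidation. The specimen values below are invented for illustration.

```python
# Coefficient of consolidation from a log t (Casagrande-type) construction.
T50 = 0.197   # Terzaghi time factor for 50% average degree of consolidation
H = 1.0       # drainage path length in cm (half specimen height, double drainage)
t50 = 600.0   # time to 50% consolidation in seconds, read from the plot

c_v = T50 * H**2 / t50   # coefficient of consolidation, cm^2/s
print(c_v)
```

The note's argument is about where on the curve t50-type quantities are picked: readings from the early part of the plot are less contaminated by secondary compression, so the resulting C_v is larger.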
Abstract:
Sawing studies of mixed-species and mixed-age plantation hardwoods represent the potential early resource flows available to a plantation sawlog processing industry. They provide the information needed to underpin a business case for investment in small-log sawing infrastructure.
Abstract:
Improved information on the product quality of the plantation resource is needed to allow businesses to consider investing in the development of value-adding processing facilities. These facilities are likely to require customised designs that optimise the utilisation of future small-diameter plantation hardwood logs. This log resource will become available as wood supply in Queensland transitions from native forests to 100% sustainable plantations, and it will be dominated by plantations established before 2000. A survey of the three main growers (the former Forest Enterprises Australia Pty Ltd, the former Forestry Corporation of New South Wales, and Hancock Queensland Plantation Pty Ltd) revealed that C. citriodora subsp. variegata – CCV (28.0%), Eucalyptus dunnii (27.5%), E. pilularis (23.0%), E. grandis (11.3%) and E. cloeziana – GMS (7.1%) were the most widely planted species in the southern Queensland and northern New South Wales subtropical hardwood estate and would potentially dominate the supply of plantation hardwoods to sawmill processing facilities.
Abstract:
High levels of percentage green veneer recovery can be obtained from temperate eucalypt plantations. Recovery traits are affected by site and log position in the stem. Of the post-felling log traits studied, out-of-roundness was the best predictor of green recovery.
Abstract:
Key message: Log-end splitting is one of the single most important defects in veneer logs. We show that log-end splitting in the temperate plantation species Eucalyptus nitens varies across sites and with within-tree log position, and increases with time in storage. Context: Log-end splitting is one of the most important defects in veneer logs because it can substantially reduce the recovery of veneer sheets. Eucalyptus nitens can develop log-end splits, but the factors affecting log-end splitting in this species are not well understood. Aims: The present study aims to describe the effects of log storage and steaming on the development of log-end splitting in logs from different plantations and log positions within the tree. Methods: The study was conducted on upper and lower logs from each of 41 trees from three 20–22-year-old Tasmanian E. nitens plantations. Log-end splitting was assessed immediately after felling, after transport and storage in a log-yard, and just before peeling. A pre-peeling steam treatment was applied to half the logs. Results: Site had a significant effect on splitting, and upper logs split more than lower logs during storage. Splitting increased with tree diameter at breast height (DBH), but this relationship varied with site. The most rapidly growing site had more splitting even after accounting for DBH. No significant effect of steaming was detected. Conclusion: Log-end splitting varied across sites and with within-tree log position, and increased with time in storage.