156 results for Derivations


Relevance: 10.00%

Abstract:

Identifying crash “hotspots”, “blackspots”, “sites with promise”, or “high risk” locations is standard practice in departments of transportation throughout the US. The literature is replete with the development and discussion of statistical methods for hotspot identification (HSID). Theoretical derivations and empirical studies have been used to weigh the benefits of various HSID methods; however, only a small number of studies have used controlled experiments to assess the methods systematically. Using experimentally derived simulated data, which are argued to be superior to empirical data for this purpose, three hotspot identification methods observed in practice are evaluated: simple ranking, confidence interval, and Empirical Bayes. With simulated data, sites with promise are known a priori, in contrast to empirical data, where high risk sites are not known with certainty. To conduct the evaluation, properties of observed crash data are used to generate simulated crash frequency distributions at hypothetical sites. A variety of factors is manipulated to simulate a host of ‘real world’ conditions. Various levels of confidence are explored, and false positives (identifying a safe site as high risk) and false negatives (identifying a high risk site as safe) are compared across methods. Finally, the effect of crash history duration on the three HSID approaches is assessed. The results illustrate that the Empirical Bayes technique significantly outperforms the ranking and confidence interval techniques (with certain caveats). As found by others, false positives and false negatives are inversely related. Three years of crash history appears, in general, to be an appropriate duration.
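
The Empirical Bayes step admits a compact sketch. The snippet below is a minimal illustration, not the study's code: it assumes a negative-binomial safety performance function (SPF) with predicted mean mu and overdispersion parameter k, so that the EB estimate for a site is a weighted average of the SPF prediction and the observed count, and sites are ranked by that shrunken estimate rather than by raw counts.

```python
import numpy as np

def eb_estimates(observed, mu_spf, k):
    """Empirical Bayes expected crash frequency for each site.

    observed : observed crash counts over the analysis period
    mu_spf   : SPF-predicted mean crashes for the same sites and period
    k        : negative-binomial overdispersion parameter of the SPF
    """
    observed = np.asarray(observed, dtype=float)
    mu_spf = np.asarray(mu_spf, dtype=float)
    w = k / (k + mu_spf)              # shrinkage weight toward the SPF prediction
    return w * mu_spf + (1.0 - w) * observed

# Hypothetical example: five sites, 3-year observed counts and SPF predictions
obs = [12, 3, 7, 15, 4]
mu = [6.0, 5.5, 6.2, 8.0, 4.1]
eb = eb_estimates(obs, mu, k=2.0)

# Simple ranking orders sites by raw counts; EB ranks by the shrunken estimate
rank_simple = np.argsort(obs)[::-1]
rank_eb = np.argsort(eb)[::-1]
print("EB estimates:  ", np.round(eb, 2))
print("simple ranking:", rank_simple, " EB ranking:", rank_eb)
```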

Relevance: 10.00%

Abstract:

A new accelerometer, the Kenz Lifecorder EX (LC; Suzuken Co. Ltd, Nagoya, Japan), offers promise as a feasible alternative to the commonly used Actigraph monitor (AG; Actigraph LLC, Fort Walton Beach, FL). Purpose: This study compared the LC and AG accelerometers and the Yamax SW-200 pedometer (DW) under free-living conditions with regard to children's steps taken and time in light-intensity physical activity (PA) and moderate to vigorous PA (MVPA). Methods: Participants (N = 31, age = 10.2 ± 0.4 yr) wore LC, AG, and DW monitors from arrival at school (7:45 a.m.) until they went to bed. Time in light and MVPA intensities was calculated using two separate intensity classifications for the LC (LC_4 and LC_5) and four classifications for the AG (AG_Treuth, AG_Puyau, AG_Trost, and AG_Freedson). Both accelerometers provided steps as outputs; DW steps were self-recorded. Repeated-measures ANOVA was used to assess overlapping monitor outputs. Results: There was no difference between DW and LC steps (Δ = 200 steps), but a nonsignificant trend was observed in the pairwise comparison between DW and AG steps (Δ = 1001 steps, P = 0.058). The AG detected significantly more steps than the LC (Δ = 801 steps, P = 0.001). Estimates of light-intensity activity minutes ranged from a low of 75.6 ± 18.4 min (LC_4) to a high of 309 ± 69.2 min (AG_Treuth). Estimates of MVPA minutes ranged from a low of 25.9 ± 9.4 min (LC_5) to a high of 112.2 ± 34.5 min (AG_Freedson). No significant differences in MVPA were seen between LC_5 and AG_Treuth (Δ = 4.9 min) or AG_Puyau (Δ = 1.7 min). Conclusion: The LC detected a comparable number of steps to the DW but significantly fewer steps than the AG in children. Current results indicate that the LC_5 and either the AG_Treuth or AG_Puyau intensity derivations provide similar mean estimates of time in MVPA during free-living activity in 10-yr-old children.
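
The device comparison rests on a repeated-measures design in which every child wears all monitors. A minimal sketch of that analysis, on synthetic data with hypothetical column names and device biases (not the study's dataset), could look like this:

```python
import numpy as np
import pandas as pd
from statsmodels.stats.anova import AnovaRM

rng = np.random.default_rng(0)
devices = ["DW", "LC", "AG"]
bias = {"DW": 0.0, "LC": -200.0, "AG": 600.0}   # hypothetical device biases

# Synthetic long-format data: one row per child per device (illustrative only)
rows = []
for child in range(31):
    true_steps = rng.normal(11000, 2500)        # child's "true" daily steps
    for dev in devices:
        rows.append({"subject": child, "device": dev,
                     "steps": true_steps + bias[dev] + rng.normal(0, 400)})
df = pd.DataFrame(rows)

# Repeated-measures ANOVA with device as the within-subject factor
print(AnovaRM(df, depvar="steps", subject="subject", within=["device"]).fit())
```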

Relevance: 10.00%

Abstract:

Many modern business environments employ software to automate the delivery of workflows, whereas workflow design and generation remain a laborious technical task for domain specialists. Several different approaches have been proposed for deriving workflow models. Some rely on process data mining, whereas others derive workflow models from operational structures, domain-specific knowledge, or workflow model compositions from knowledge bases. Many approaches draw on principles from automatic planning, but they are conceptual in nature and lack mathematical justification. In this paper we present a mathematical framework for deducing tasks in workflow models from plans in mechanistic or strongly controlled work environments, with a focus on automatic plan generation. In addition, we establish an associative composition operator that permits crisp hierarchical task compositions for workflow models through a set of mathematical deduction rules. The result is a logical framework that can be used to derive tasks in workflow hierarchies from operational information about work processes and machine configurations in controlled or mechanistic work environments.
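
As a toy illustration of the algebraic property at stake, and not the paper's operator or notation, sequential composition of tasks by concatenating their primitive steps is associative:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Task:
    """A workflow task; `steps` is its flattened sequence of primitive actions."""
    name: str
    steps: tuple

def compose(a: Task, b: Task) -> Task:
    """Sequential composition: run a, then b. Concatenating steps makes it associative."""
    return Task(name=f"({a.name};{b.name})", steps=a.steps + b.steps)

load = Task("load", ("pick", "place"))
drill = Task("drill", ("clamp", "drill", "unclamp"))
unload = Task("unload", ("pick", "bin"))

left = compose(compose(load, drill), unload)    # (load;drill);unload
right = compose(load, compose(drill, unload))   # load;(drill;unload)
assert left.steps == right.steps                # same composed behaviour either way
print(left.steps)
```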

Relevance: 10.00%

Abstract:

Shaky Ground was a solo exhibition of works by Charles Robb held at Ryan Renshaw gallery, Brisbane, in 2012. The exhibition comprised three sculptural works: a white rotating roundel with a drawing of the artist as seen from above; an artificial rock with a spinning aniseed ball nestled in one of its fissures; and a sculptural portrait of the artist dressed in a protective dust suit, mounted perpendicular to the wall. The works were derivations or reorientations of previously exhibited work and established an ambiguous field of associations with one another based on formal characteristics or their proximity to the production site and processes. In so doing, the exhibition formed part of the artist's ongoing exploration of sculpture, subjectivity, and autogenous approaches to art practice.

Relevance: 10.00%

Abstract:

Reliable robotic perception and planning are critical to performing autonomous actions in uncertain, unstructured environments. In field robotic systems, automation is achieved by interpreting exteroceptive sensor information to infer something about the world. This is then mapped to provide a consistent spatial context, so that actions can be planned around the predicted future interaction of the robot and the world. The whole system is as reliable as the weakest link in this chain. In this paper, the term mapping is used broadly to describe the transformation of range-based exteroceptive sensor data (such as LIDAR or stereo vision) to a fixed navigation frame, so that it can be used to form an internal representation of the environment. The coordinate transformation from the sensor frame to the navigation frame is analyzed to produce a spatial error model that captures the dominant geometric and temporal sources of mapping error. This allows the mapping accuracy to be calculated at run time. A generic extrinsic calibration method for exteroceptive range-based sensors is then presented to determine the sensor location and orientation. This allows systematic errors in individual sensors to be minimized, and when multiple sensors are used, it minimizes the systematic contradiction between them to enable reliable multisensor data fusion. The mathematical derivations at the core of this model are not particularly novel or complicated, but the rigorous analysis and application to field robotics seem to be largely absent from the literature to date. The techniques in this paper are simple to implement, and they offer a significant improvement to the accuracy, precision, and integrity of mapped information. Consequently, they should be employed whenever maps are formed from range-based exteroceptive sensor data. © 2009 Wiley Periodicals, Inc.
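
In generic notation (an illustrative sketch, not the paper's exact model), mapping a range observation p^s from the sensor frame s into the navigation frame n chains the sensor-to-body and body-to-navigation transforms, and a first-order error model follows by propagating the uncertain parameters through the Jacobian of that chain:

$$
\mathbf{p}^{n} = \mathbf{R}^{n}_{b}(t)\left(\mathbf{R}^{b}_{s}\,\mathbf{p}^{s} + \mathbf{t}^{b}_{s}\right) + \mathbf{t}^{n}_{b}(t),
\qquad
\Sigma_{p^{n}} \approx \mathbf{J}\,\Sigma_{\theta}\,\mathbf{J}^{\mathsf{T}},
$$

where R^b_s, t^b_s are the extrinsic (mounting) calibration parameters, R^n_b(t), t^n_b(t) the time-stamped vehicle pose, θ collects the uncertain quantities (range/bearing noise, calibration, pose, timing), and J is the Jacobian of the transform with respect to θ. Evaluating such an expression at run time is one way to obtain the mapping accuracy the abstract refers to.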

Relevance: 10.00%

Abstract:

The motion of marine vessels has traditionally been studied using two different approaches: manoeuvring and seakeeping. These two approaches use different reference frames and coordinate systems to describe the motion. This paper derives the kinematic models that characterize the transformation of motion variables (position, velocity, and acceleration) and forces between the different coordinate systems used in these theories. The derivations presented here are carried out in terms of the formalism adopted in robotics; the advantage of this formulation is the use of matrix notation and operations. As an application, the transformation of the linear equations of motion used in seakeeping into body-fixed coordinates is considered for both zero and forward speed.
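
Using the standard notation of the marine-craft literature (a sketch for orientation, not necessarily the paper's symbols), the body-fixed generalized velocity ν and the time derivative of the position-attitude vector η are related by a block-diagonal kinematic transformation built from the rotation matrix R(Θ) and the Euler-angle rate transform T(Θ):

$$
\dot{\boldsymbol{\eta}} = \mathbf{J}(\boldsymbol{\Theta})\,\boldsymbol{\nu},
\qquad
\mathbf{J}(\boldsymbol{\Theta}) =
\begin{bmatrix}
\mathbf{R}(\boldsymbol{\Theta}) & \mathbf{0} \\
\mathbf{0} & \mathbf{T}(\boldsymbol{\Theta})
\end{bmatrix} .
$$

Mapping seakeeping quantities into body-fixed coordinates amounts to applying transformations of this kind to the motion variables and forces, with the steady forward-speed translation of the seakeeping frame handled separately.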

Relevance: 10.00%

Abstract:

In an extension of the work of the preceding paper, the relativistic front form for Maxwell's equations for electromagnetism is developed and shown to be particularly suited to the description of paraxial waves. The generators of the Poincaré group in a form applicable directly to the electric and magnetic field vectors are derived. It is shown that the effect of a thin lens on a paraxial electromagnetic wave is given by a six-dimensional transformation matrix, constructed out of certain special generators of the Poincaré group. The method of construction guarantees that the free propagation of such waves as well as their transmission through ideal optical systems can be described in terms of the metaplectic group, exactly as found for scalar waves by Bacry and Cadilhac. An alternative formulation in terms of a vector potential is also constructed. It is chosen in a gauge suggested by the front form and by the requirement that the lens transformation matrix act locally in space. Pencils of light with accompanying polarization are defined for statistical states in terms of the two-point correlation function of the vector potential. Their propagation and transmission through lenses are briefly considered in the paraxial limit. This paper extends Fourier optics and completes it by formulating it for the Maxwell field. We stress that the derivations depend explicitly on the "henochromatic" idealization as well as the identification of the ideal lens with a quadratic phase shift and are heuristic to this extent.
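
For orientation only: in scalar paraxial optics a thin lens acts through a quadratic phase factor, equivalently the familiar 2×2 ray-transfer matrix below (sign conventions vary); the six-dimensional matrix discussed in the abstract plays the analogous role for the electric and magnetic field vectors, and its specific construction from Poincaré generators is the paper's own.

$$
t_{\text{lens}}(x,y) = \exp\!\left[-\,\frac{ik}{2f}\left(x^{2}+y^{2}\right)\right],
\qquad
M_{\text{lens}} =
\begin{pmatrix}
1 & 0 \\
-1/f & 1
\end{pmatrix}.
$$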

Relevance: 10.00%

Abstract:

A formal chemical nomenclature system, WISENOM, based on a context-free grammar and graph coding is described. The system yields names for organic compounds that are unique, unambiguous, easily pronounceable, encodable, and decodable. Because it is a formal system, every name is provable as a theorem, or derivable as a terminal sentence, using the basic axioms and rewrite rules. The syntax in Backus-Naur form, examples of name derivations, and the corresponding derivation trees are provided. Encoding procedures to convert connectivity tables to WISENOM names, parsing, and decoding are described.
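
The WISENOM grammar itself is not reproduced here. As a toy stand-in (grammar, symbols, and names invented for illustration), the sketch below shows the mechanism the abstract describes: a name is derivable as a terminal sentence by repeatedly applying the rewrite rules of a context-free grammar, and recording the choices gives the derivation tree.

```python
# Toy context-free grammar (NOT the actual WISENOM syntax), in a BNF-like form:
#   <name>   ::= <prefix> <suffix>
#   <prefix> ::= "meth" | "eth" | "prop" | "but"
#   <suffix> ::= "ane" | "ene" | "yne"
GRAMMAR = {
    "<name>":   [["<prefix>", "<suffix>"]],
    "<prefix>": [["meth"], ["eth"], ["prop"], ["but"]],
    "<suffix>": [["ane"], ["ene"], ["yne"]],
}

def derive(symbol, choose):
    """Expand `symbol` into a terminal string, recording the derivation tree.

    `choose(symbol, alternatives)` picks which production to apply, so the same
    routine can enumerate names, generate them, or replay a stored derivation.
    """
    if symbol not in GRAMMAR:               # terminal: nothing to rewrite
        return symbol, symbol
    production = choose(symbol, GRAMMAR[symbol])
    parts = [derive(s, choose) for s in production]
    text = "".join(p[0] for p in parts)
    tree = (symbol, [p[1] for p in parts])
    return text, tree

# Replay a fixed derivation: prefix -> "eth", suffix -> "ane"  =>  "ethane"
fixed = {"<name>": 0, "<prefix>": 1, "<suffix>": 0}
text, tree = derive("<name>", lambda sym, alts: alts[fixed[sym]])
print(text)   # ethane
print(tree)   # ('<name>', [('<prefix>', ['eth']), ('<suffix>', ['ane'])])
```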

Relevance: 10.00%

Abstract:

The paper presents a new approach to improve the detection and tracking performance of a track-while-scan (TWS) radar. The contribution consists of three parts. In Part 1 the scope of various papers in this field is reviewed. In Part 2, a new approach for integrating the detection and tracking functions is presented. It shows how a priori information from the TWS computer can be used to improve detection. A new multitarget tracking algorithm has also been developed. It is specifically oriented towards solving the combinatorial problems in multitarget tracking. In Part 3, analytical derivations are presented for quantitatively assessing, a priori, the performance of a track-while-scan radar system (true track initiation, false track initiation, true track continuation and false track deletion characteristics). Simulation results are also shown.

Relevance: 10.00%

Abstract:

The paper presents, in three parts, a new approach to improve the detection and tracking performance of a track-while-scan radar. Part 1 presents a review of the current status of the subject. Part 2 details the new approach. It shows how a priori information provided by the tracker can be used to improve detection. It also presents a new multitarget tracking algorithm. In the present Part, analytical derivations are presented for assessing, a priori, the performance of the TWS radar system. True track initiation, false track initiation, true track continuation and false track deletion characteristics have been studied. It indicates how the various thresholds can be chosen by the designer to optimise performance. Simulation results are also presented.
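
As a flavour of this kind of a priori assessment (an illustrative textbook-style calculation, not the paper's derivation), assume an M-out-of-N confirmation rule with independent scans, per-scan detection probability p_d for a true target, and per-gate false-alarm probability p_fa; the initiation probabilities are then binomial tail sums:

```python
from math import comb

def m_of_n(p, m, n):
    """P(at least m detections in n independent scans), per-scan probability p."""
    return sum(comb(n, k) * p**k * (1 - p)**(n - k) for k in range(m, n + 1))

p_d, p_fa = 0.85, 0.05    # hypothetical per-scan detection / false-alarm probabilities
m, n = 3, 4               # confirm a tentative track on 3 detections in 4 scans

print(f"true track initiation:  {m_of_n(p_d, m, n):.4f}")
print(f"false track initiation: {m_of_n(p_fa, m, n):.6f}")
```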

Relevance: 10.00%

Abstract:

The paper presents, in three parts, a new approach to improve the detection and tracking performance of a track-while-scan (TWS) radar. Part 1 presents a review of the current status. In this part, Part 2, it is shown how detection can be improved by utilising information from the tracker. A new multitarget tracking algorithm, capable of tracking manoeuvring targets in clutter, is then presented. The algorithm is specifically tailored so that the solution to the combinatorial problem presented in a companion paper can be applied. Implementation aspects are discussed and a multiprocessor architecture is identified to realise the full potential of the algorithm. Part 3 presents analytical derivations for quantitative assessment of the performance of the TWS radar system. It also shows how the performance can be optimised.
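
The combinatorial core here is the measurement-to-track association problem. A generic sketch (not the paper's algorithm) combines chi-square gating with an optimal one-to-one assignment over a cost matrix of squared Mahalanobis distances; scipy's Hungarian-algorithm solver is used purely for illustration.

```python
import numpy as np
from scipy.optimize import linear_sum_assignment

def associate(pred, meas, S, gate=9.21):
    """Gate measurements and assign them to predicted track positions.

    pred : (T, 2) predicted track positions
    meas : (M, 2) measurements from the current scan
    S    : (2, 2) innovation covariance (assumed common to all tracks here)
    gate : chi-square gate threshold (9.21 ~ 99% for 2 degrees of freedom)
    """
    Sinv = np.linalg.inv(S)
    d = pred[:, None, :] - meas[None, :, :]                 # (T, M, 2) innovations
    cost = np.einsum("tmi,ij,tmj->tm", d, Sinv, d)          # squared Mahalanobis
    cost = np.where(cost > gate, 1e6, cost)                 # forbid out-of-gate pairs
    rows, cols = linear_sum_assignment(cost)                # optimal 1-to-1 assignment
    return [(t, m) for t, m in zip(rows, cols) if cost[t, m] < 1e6]

pred = np.array([[0.0, 0.0], [10.0, 10.0]])
meas = np.array([[0.4, -0.2], [9.5, 10.3], [30.0, 30.0]])   # last one is clutter
S = np.eye(2)
print(associate(pred, meas, S))    # [(0, 0), (1, 1)]
```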

Relevance: 10.00%

Abstract:

Recent axiomatic derivations of the maximum entropy principle from consistency conditions are critically examined. We show that proper application of consistency conditions alone allows a wider class of functionals, essentially of the form ∫ dx p(x)[p(x)/g(x)]^s for some real number s, to be used for inductive inference; the commonly used form −∫ dx p(x) ln[p(x)/g(x)] is only a particular case. The role of the prior density g(x) is clarified. It is possible to regard it as a geometric factor describing the coordinate system used; it does not represent information of the same kind as that obtained by measurements on the system in the form of expectation values.
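
One way to see how the logarithmic form sits inside the wider class (a standard small-s expansion, offered only as an illustrative remark):

$$
\int dx\, p(x)\left[\frac{p(x)}{g(x)}\right]^{s}
= \int dx\, p(x)\, e^{\,s \ln [p(x)/g(x)]}
= 1 + s \int dx\, p(x) \ln\frac{p(x)}{g(x)} + O(s^{2}),
$$

so to first order in s the general functional carries the same variational content as the familiar relative-entropy form −∫ dx p(x) ln[p(x)/g(x)].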

Relevance: 10.00%

Abstract:

Placzek [1] was the first to derive general expressions for the intensities of overtones in the case of Raman scattering. He assumed electrical anharmonicity; however, he left the expressions for the derivatives of the polarizability tensor undetermined. In 1941, a classical and semiempirical theory was developed by Wolkenstein [2], who assumed the validity of the additivity of bond polarizabilities. However, the expressions he derived for the intensities of overtones have yet to be verified. The purpose of this paper is to derive a formula for the Raman polarizability tensor for overtones of (intramolecular) vibrational spectra along the lines of Kondilenko et al. [3,4].
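
For context, in generic notation (not necessarily the paper's), Placzek-type treatments expand the polarizability tensor in the normal coordinates; the first derivatives govern the fundamentals, while overtone and combination intensities bring in the second-derivative (electrically anharmonic) terms, which the bond-polarizability approach is then used to evaluate:

$$
\alpha_{ij}(Q) = \alpha_{ij}^{(0)}
+ \sum_{k}\left(\frac{\partial \alpha_{ij}}{\partial Q_{k}}\right)_{0} Q_{k}
+ \frac{1}{2}\sum_{k,l}\left(\frac{\partial^{2} \alpha_{ij}}{\partial Q_{k}\,\partial Q_{l}}\right)_{0} Q_{k} Q_{l} + \cdots
$$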

Relevance: 10.00%

Abstract:

After Gödel's incompleteness theorems and the collapse of Hilbert's programme, Gerhard Gentzen continued the quest for consistency proofs of Peano arithmetic. He considered a finitistic or constructive proof still possible and necessary for the foundations of mathematics. For a proof to be meaningful, the principles relied on should be considered more reliable than the doubtful elements of the theory concerned. He worked out a total of four proofs between 1934 and 1939. This thesis examines Gentzen's consistency proofs for arithmetic from different angles. The consistency of Heyting arithmetic is shown both in a sequent calculus notation and in natural deduction. The former proof includes a cut elimination theorem for the calculus and a syntactical study of the purely arithmetical part of the system. The latter consistency proof, in standard natural deduction, has been an open problem since the publication of Gentzen's proofs. The solution to this problem for an intuitionistic calculus is based on a normalization proof by Howard. The proof is performed in the manner of Gentzen, by giving a reduction procedure for derivations of falsity. In contrast to Gentzen's proof, the procedure contains a vector assignment. The reduction reduces the first component of the vector, and this component can be interpreted as an ordinal less than epsilon_0, thus ordering the derivations by complexity and proving termination of the process.

Relevance: 10.00%

Abstract:

A comparison is made between German and Russian terminological derivations in chemistry and the methods used by Germans and Russians to solve problems related to the formation of scientific words. A study of this comparison, it is believed, can help us in the development of scientific words in Indian languages.