856 results for Time-to-collision


Relevance: 90.00%

Abstract:

Over the last few years, football has entered a period of accelerated access to large amounts of match analysis data. Social network analysis has been adopted to reveal the structure and organization of the web of interactions between players, such as their passing distribution tendencies. In this study we investigated the influence of ball possession characteristics on the competitive success of Spanish La Liga teams. The sample was composed of OPTA passing distribution raw data (n = 269,055 passes) obtained from 380 matches involving all 20 teams of the 2012/2013 season. We then generated 760 adjacency matrices and their corresponding social networks using Node XL software. For each network we calculated three team performance measures to evaluate ball possession tendencies: graph density, average clustering, and passing intensity. Three levels of competitive success were determined using two-step cluster analysis based on two input variables: the total points scored by each team and the ratio of goals scored to goals conceded. Our analyses revealed significant differences between competitive performance levels on all three team performance measures (p < .001). Bottom-ranked teams had fewer connected players (graph density) and fewer triangulations (average clustering) than intermediate and top-ranked teams. Moreover, all three clusters diverged in terms of passing intensity, with top-ranked teams completing more passes per unit of possession time than intermediate and bottom-ranked teams. Finally, similarities and dissimilarities in the signatures of play of the 20 teams were displayed using Cohen's effect size. In sum, the findings suggest that competitive performance was influenced by the density and connectivity of the teams, mainly through the way teams use their possession time to give intensity to their game.
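As a simple illustration of two of the measures named above, the sketch below computes graph density and passing intensity from a small, invented passing matrix in plain Python. The study itself used Node XL on OPTA data; none of this code or data comes from it.

```python
def graph_density(adj):
    """Fraction of ordered player pairs connected by at least one pass."""
    n = len(adj)
    links = sum(1 for i in range(n) for j in range(n)
                if i != j and adj[i][j] > 0)
    return links / (n * (n - 1))

def passing_intensity(total_passes, possession_seconds):
    """Passes completed per minute of ball possession."""
    return total_passes / (possession_seconds / 60.0)

# Invented 4-player passing matrix: passes[i][j] = passes from i to j.
passes = [
    [0, 5, 2, 0],
    [3, 0, 4, 1],
    [0, 2, 0, 6],
    [1, 0, 3, 0],
]

total = sum(sum(row) for row in passes)       # 27 passes in total
density = graph_density(passes)               # 9 of 12 ordered pairs linked
intensity = passing_intensity(total, 540)     # over 9 minutes of possession
```

Here higher `intensity` corresponds to the "more passes per possession time" signature the study attributes to top-ranked teams.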

Relevance: 90.00%

Abstract:

One of the most exciting discoveries in astrophysics of the last decade is the sheer diversity of planetary systems. These include "hot Jupiters", giant planets so close to their host stars that they orbit once every few days; "Super-Earths", planets with sizes intermediate between those of Earth and Neptune, of which no analogs exist in our own solar system; multi-planet systems with planets ranging from smaller than Mars to larger than Jupiter; planets orbiting binary stars; free-floating planets flying through the emptiness of space without any star; and even planets orbiting pulsars. Despite these remarkable discoveries, the field is still young, and there are many areas about which precious little is known. In particular, we do not know of the planets orbiting the Sun-like stars nearest to our own solar system, and we know very little about the compositions of extrasolar planets. This thesis provides developments in those directions, through two instrumentation projects.

The first chapter of this thesis concerns detecting planets in the Solar neighborhood using precision stellar radial velocities, also known as the Doppler technique. We present an analysis determining the most efficient way to detect planets considering factors such as spectral type, wavelengths of observation, spectrograph resolution, observing time, and instrumental sensitivity. We show that G and K dwarfs observed at 400-600 nm are the best targets for surveys complete down to a given planet mass and out to a specified orbital period. Overall we find that M dwarfs observed at 700-800 nm are the best targets for habitable-zone planets, particularly when including the effects of systematic noise floors caused by instrumental imperfections. Somewhat surprisingly, we demonstrate that a modestly sized observatory, with a dedicated observing program, is up to the task of discovering such planets.
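The survey trade-offs described above rest on the standard Keplerian radial-velocity semi-amplitude. As a rough illustration (not code from the thesis), the textbook formula can be evaluated directly, here for a Jupiter/Sun analog:

```python
import math

G = 6.674e-11          # gravitational constant, m^3 kg^-1 s^-2
M_SUN = 1.989e30       # solar mass, kg
M_JUP = 1.898e27       # Jupiter mass, kg

def rv_semi_amplitude(m_planet, m_star, period_s,
                      inclination=math.pi / 2, ecc=0.0):
    """Stellar reflex velocity K (m/s) induced by a planet on a
    Keplerian orbit, assuming m_planet << m_star."""
    return ((2 * math.pi * G / period_s) ** (1 / 3)
            * m_planet * math.sin(inclination)
            / (m_star ** (2 / 3) * math.sqrt(1 - ecc ** 2)))

# Jupiter induces a wobble of roughly 12.5 m/s on the Sun
# over its 11.86-year orbit.
K_jup = rv_semi_amplitude(M_JUP, M_SUN, 11.862 * 365.25 * 86400)
```

Earth-mass planets in habitable zones push K below 1 m/s, which is why the instrumental noise floors discussed above dominate the target-selection question.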

We present just such an observatory in the second chapter, called the "MINiature Exoplanet Radial Velocity Array," or MINERVA. We describe the design, which uses a novel multi-aperture approach to increase stability and performance through lower system étendue, while keeping costs and time to deployment down. We present calculations of the expected planet yield, along with data showing the system's performance during testing and development on Caltech's campus. We also present the motivation, design, and performance of a fiber coupling system for the array, critical for efficiently and reliably bringing light from the telescopes to the spectrograph. We finish by presenting the current status of MINERVA, operational on Mt. Hopkins in Arizona.

The second part of this thesis concerns a very different method of planet detection: direct imaging, the discovery and characterization of planets by collecting and analyzing their light. Directly analyzing planetary light is the most promising way to study planets' atmospheres, formation histories, and compositions. Direct imaging is extremely challenging, as it requires a high-performance adaptive optics system to undo the atmospheric blurring of the parent star's point-spread function, a coronagraph to suppress stellar diffraction, and image post-processing to remove non-common-path "speckle" aberrations that can overwhelm any planetary companions.

To this end, we present the "Stellar Double Coronagraph," or SDC, a flexible coronagraphic platform for use with the 200-inch Hale Telescope. It has two focal and pupil planes, allowing for a number of different observing modes, including multiple vortex phase masks in series for improved contrast and inner working angle behind the obscured aperture of the telescope. We present the motivation, design, performance, and data reduction pipeline of the instrument. In the following chapter, we present some early science results, including the first image of a companion to the star delta Andromedae, which had been previously hypothesized but never seen.

A further chapter presents a wavefront control code developed for the instrument, using the technique of "speckle nulling," which can remove optical aberrations from the system using the deformable mirror of the adaptive optics system. This code allows for improved contrast and inner working angles, and was written in a modular style so as to be portable to other high contrast imaging platforms. We present its performance on optical, near-infrared, and thermal infrared instruments on the Palomar and Keck telescopes, showing how it can improve contrasts by a factor of a few in less than ten iterations.
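A toy caricature of such a speckle-nulling loop, reduced to a single speckle treated as one complex amplitude, might look like the following. The grid sizes and the example field are invented, and, as at the telescope, only intensities are "measured":

```python
import cmath
import math

def null_speckle(E, iterations=3, max_amp=2.0):
    """Toy speckle-nulling loop: scan deformable-mirror 'ripple' probes
    over a grid of amplitudes and phases, keep whichever probe most
    reduces the measured intensity, apply it, and repeat."""
    for _ in range(iterations):
        best_probe, best_intensity = 0j, abs(E) ** 2   # doing nothing is allowed
        for k in range(1, 21):
            amp = max_amp * k / 20
            for j in range(64):
                probe = amp * cmath.exp(1j * 2 * math.pi * j / 64)
                intensity = abs(E + probe) ** 2        # focal-plane measurement
                if intensity < best_intensity:
                    best_probe, best_intensity = probe, intensity
        E = E + best_probe                             # apply the chosen ripple
    return E

E0 = 1.0 + 0.5j              # one speckle's complex amplitude (arbitrary units)
E_final = null_speckle(E0)   # intensity drops by well over an order of magnitude
```

The real code drives many speckles at once through the adaptive optics system, but the same scan-and-keep-the-best logic underlies it.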

One of the large challenges in direct imaging is sensing and correcting the electric field in the focal plane to remove scattered light that can be much brighter than any planets. In the last chapter, we present a new method of focal-plane wavefront sensing, combining a coronagraph with a simple phase-shifting interferometer. We present its design and implementation on the Stellar Double Coronagraph, demonstrating its ability to create regions of high contrast by measuring and correcting for optical aberrations in the focal plane. Finally, we derive how it is possible to use the same hardware to distinguish companions from speckle errors using the principles of optical coherence. We present results observing the brown dwarf HD 49197b, demonstrating the ability to detect it despite it being buried in the speckle noise floor. We believe this is the first detection of a substellar companion using the coherence properties of light.
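The single-pixel arithmetic behind phase-shifting focal-plane sensing can be sketched as follows. This is an illustrative simplification (one speckle, a real reference probe of known amplitude), not the instrument's pipeline:

```python
import cmath
import math

def recover_field(measure, probe_amp):
    """Estimate a complex focal-plane field E from four intensity
    measurements with a real probe p stepped through phases
    0, pi/2, pi, 3pi/2, using I(t) = |E + p*exp(i*t)|^2, which gives
        Re(E) = (I0 - I_pi) / (4p),  Im(E) = (I_pi/2 - I_3pi/2) / (4p)."""
    I0, I1, I2, I3 = (measure(k * math.pi / 2) for k in range(4))
    return complex((I0 - I2) / (4 * probe_amp),
                   (I1 - I3) / (4 * probe_amp))

# Simulated single-pixel 'instrument' hiding a speckle field E_true.
E_true = 0.3 - 0.7j
p = 0.5
E_hat = recover_field(lambda t: abs(E_true + p * cmath.exp(1j * t)) ** 2, p)
# The correction applied via the deformable mirror is then -E_hat.
```

Because a planet's light is incoherent with the star's, it does not respond to the phase-shifted probes, which is the principle behind the coherence-based companion detection described above.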

Relevance: 90.00%

Abstract:

World War II profoundly impacted Florida. The military geography of the State is essential to an understanding of the war. The geostrategic concerns of place and space determined that Florida would become a statewide military base. Florida's attributes of place, such as climate and topography, determined its use as a military academy hosting over two million soldiers, nearly 15 percent of the GI Army, the largest force the US ever raised. One in eight Floridians went into uniform. Equally, Florida's space on the planet made it central to both defensive and offensive strategies. The Second World War was a war of movement, and Florida was a major jump-off point for US force projection worldwide, especially of air power. Florida's demography facilitated its use as a base camp for the assembly and engagement of this military power. In 1940, less than two percent of the US population lived in Florida, a quiet, barely populated backwater of the United States.[1] But owing to its critical place and space, over the next few years it became a 65,000-square-mile training ground, supply dump, and embarkation site vital to the US war effort. Because of its place astride some of the most important sea lanes in the Atlantic World, Florida was the scene of one of the few Western Hemisphere battles of the war. The militarization of Florida began long before Pearl Harbor. The pre-war buildup conformed to the US strategy of the war. The strategy of the US was then (and remains today) one of forward defense: harden the frontier, then take the battle to the enemy, rather than fight them in North America. The policy of "Europe First" focused the main US war effort on the defeat of Hitler's Germany, evaluated to be the most dangerous enemy. In Florida were established the military forces requiring the longest time to develop, and most needed to defeat the Axis. Those were a naval aviation force for sea-borne hostilities, a heavy bombing force for reducing enemy industrial states, and an aerial logistics train for overseas supply of expeditionary campaigns. The unique Florida coastline made possible the seaborne invasion training demanded for US victory. The civilian population was employed assembling mass-produced first-generation container ships, while Florida hosted casualties, prisoners of war, and transient personnel moving between the Atlantic and Pacific. By the end of hostilities and the lifting of Unlimited Emergency, officially on December 31, 1946, Florida had become a transportation nexus. Florida accommodated a return of demobilized soldiers and a migration of displaced persons, and evolved into a modern veterans' colonia. It was instrumental in fashioning the modern US military, while remaining a center of the active National Defense establishment. Those are the themes of this work. [1] US Census of Florida 1940. Table 4 – Race, By Nativity and Sex, For the State. 14.

Relevance: 90.00%

Abstract:

During the past decade, there has been a dramatic increase by postsecondary institutions in providing academic programs and course offerings in a multitude of formats and venues (Biemiller, 2009; Kucsera & Zimmaro, 2010; Lang, 2009; Mangan, 2008). Strategies pertaining to reapportionment of course-delivery seat time have been a major facet of these institutional initiatives; most notably, within many open-door 2-year colleges. Often, these enrollment-management decisions are driven by the desire to increase market-share, optimize the usage of finite facility capacity, and contain costs, especially during these economically turbulent times. So, while enrollments have surged to the point where nearly one in three 18-to-24 year-old U.S. undergraduates are community college students (Pew Research Center, 2009), graduation rates, on average, still remain distressingly low (Complete College America, 2011). Among the learning-theory constructs related to seat-time reapportionment efforts is the cognitive phenomenon commonly referred to as the spacing effect, the degree to which learning is enhanced by a series of shorter, separated sessions as opposed to fewer, more massed episodes. This ex post facto study explored whether seat time in a postsecondary developmental-level algebra course is significantly related to: course success; course-enrollment persistence; and, longitudinally, the time to successfully complete a general-education-level mathematics course. Hierarchical logistic regression and discrete-time survival analysis were used to perform a multi-level, multivariable analysis of a student cohort (N = 3,284) enrolled at a large, multi-campus, urban community college. The subjects were retrospectively tracked over a 2-year longitudinal period. The study found that students in long seat-time classes tended to withdraw earlier and more often than did their peers in short seat-time classes (p < .05). 
Additionally, a model comprising nine statistically significant covariates (all with p-values less than .01) was constructed. However, no longitudinal seat-time group differences were detected, nor was there sufficient statistical evidence to conclude that seat time was predictive of developmental-level course success. A principal aim of this study was to demonstrate, to educational leaders, researchers, and institutional-research/business-intelligence professionals, the advantages and computational practicability of survival analysis, an underused but more powerful way to investigate changes in students over time.
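The discrete-time survival machinery the study advocates starts from a simple quantity, the per-period hazard: the share of still-enrolled students who withdraw in each period. A minimal life-table estimator, with an invented five-student cohort, is:

```python
def discrete_hazard(records):
    """Discrete-time hazard (life-table) estimate.
    records: (last_period_observed, event) pairs, where event = 1 marks a
    withdrawal in that period and 0 marks censoring (e.g. course completed
    or observation window ended).
    Returns {period: withdrawals / students still at risk}."""
    horizon = max(t for t, _ in records)
    hazards = {}
    for t in range(1, horizon + 1):
        at_risk = sum(1 for dur, _ in records if dur >= t)
        events = sum(1 for dur, e in records if dur == t and e == 1)
        if at_risk:
            hazards[t] = events / at_risk
    return hazards

# Invented 5-student cohort tracked over 3 terms.
cohort = [(1, 1), (2, 0), (2, 1), (3, 1), (3, 0)]
h = discrete_hazard(cohort)
```

The study's discrete-time survival analysis models exactly these period-specific hazards, with covariates added via logistic regression on the person-period data.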

Relevance: 90.00%

Abstract:

Maintenance of transport infrastructure assets is widely advocated as the key to minimizing current and future costs of the transportation network. While effective maintenance decisions are often a result of engineering skills and practical knowledge, efficient decisions must also account for the net result over an asset's life-cycle. One essential aspect of the long-term perspective on transport infrastructure maintenance is proactively estimating maintenance needs. In dealing with immediate maintenance actions, support tools that can prioritize potential maintenance candidates are important for obtaining an efficient maintenance strategy. This dissertation consists of five individual research papers presenting a microdata analysis approach to transport infrastructure maintenance. Microdata analysis is a multidisciplinary field in which large quantities of data are collected, analyzed, and interpreted to improve decision-making. Increased access to transport infrastructure data enables a deeper understanding of causal effects and makes it possible to predict future outcomes. The microdata analysis approach covers the complete process from data collection to actual decisions and is therefore well suited to the task of improving efficiency in transport infrastructure maintenance. Statistical modeling was the analysis method selected in this dissertation, and it provided solutions to the different problems presented in each of the five papers. In Paper I, a time-to-event model was used to estimate remaining road pavement lifetimes in Sweden. In Paper II, an extension of the model in Paper I assessed the impact of latent variables on road lifetimes, identifying the sections in a road network that are weaker due to, e.g., subsoil conditions or undetected heavy traffic. The study in Paper III incorporated a probabilistic parametric distribution as a representation of road lifetimes into an equation for the marginal cost of road wear. Differentiated road wear marginal costs for heavy and light vehicles are an important information basis for decisions regarding vehicle miles traveled (VMT) taxation policies. In Paper IV, a distribution-based clustering method was used to distinguish between road segments that are deteriorating and road segments whose condition is stationary. Within railway networks, temporary speed restrictions are often imposed because of maintenance and must be addressed in order to maintain punctuality. The study in Paper V evaluated the empirical effect of speed restrictions on running time on a Norwegian railway line using a generalized linear mixed model.
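For time-to-event data with right-censoring, as in the pavement-lifetime models of Papers I and II, a minimal nonparametric starting point is the Kaplan-Meier estimator. The sketch below uses invented lifetimes, not the Swedish data:

```python
def kaplan_meier(data):
    """Kaplan-Meier survival curve.
    data: (lifetime, observed) pairs, where observed = 0 marks
    right-censoring (e.g. a road section not yet resurfaced when the
    observation window ended).
    Returns [(time, survival_probability)] at each observed failure time."""
    times = sorted({t for t, obs in data if obs})
    s, curve = 1.0, []
    for t in times:
        at_risk = sum(1 for life, _ in data if life >= t)
        failures = sum(1 for life, obs in data if life == t and obs)
        s *= 1 - failures / at_risk          # survive this failure time
        curve.append((t, s))
    return curve

# Invented pavement-section lifetimes in years; 0 marks censoring.
sections = [(5, 1), (8, 0), (12, 1), (12, 1), (20, 0)]
curve = kaplan_meier(sections)
```

Parametric time-to-event models like those in the dissertation replace this step function with a fitted distribution, which is what allows extrapolation of remaining lifetimes.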

Relevance: 90.00%

Abstract:

The Consumer Finance Division of the South Carolina State Board of Financial Institutions is responsible for the supervision, licensing and examination of all consumer finance companies, deferred presentment companies, check cashing companies, and non-depository mortgage lenders and their loan originators. This project focuses specifically on the licensing of Mortgage Lender/Servicer (company), Mortgage Lender/Servicer Branch (branch) and Mortgage Loan Originator (loan originator) licenses. The problem statement is how the Division can handle an increasing number of mortgage loan originators in the state without lengthening the time taken to process applications. The goal of this project is to make the current licensing process more efficient so that the Division can handle the increased workload without having to hire additional personnel.

Relevance: 90.00%

Abstract:

Survival models are widely applied in the engineering field to model time-to-event data, since censored data are a common issue there. Whether or not parametric models are used, they may not always provide a good fit in the case of heterogeneous data. The present study relies on survival data for critical pumps, where traditional parametric regression might be improved upon to obtain better approximations. Taking censoring into account, and using an empirical method to split the data into two subgroups so that separate models could be fitted to the censored data, we mixed two distinct distributions following a mixture-model approach. We concluded that this is a good method for fitting data that do not follow a usual parametric distribution and for obtaining reliable parameters. A constant cumulative hazard rate policy was also used to determine optimum inspection times from the fitted mixture model, which could be a plus when compared with the current maintenance policies to check whether changes should be introduced.
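As a sketch of the mixture-model idea, simplified to two exponential components and ignoring censoring (so it is an analogue of, not a reproduction of, the study's model), an EM fit might look like:

```python
import math
import random

def em_two_exponentials(t, iters=300):
    """Fit pi*Exp(l1) + (1 - pi)*Exp(l2) to lifetimes t by EM.
    Censoring is ignored here for brevity."""
    t = sorted(t)
    half = len(t) // 2
    l1 = half / sum(t[:half])                # init: fast-failing subgroup
    l2 = (len(t) - half) / sum(t[half:])     # init: slow-failing subgroup
    pi = 0.5
    for _ in range(iters):
        # E-step: responsibility of component 1 for each lifetime.
        r = [pi * l1 * math.exp(-l1 * x)
             / (pi * l1 * math.exp(-l1 * x)
                + (1 - pi) * l2 * math.exp(-l2 * x))
             for x in t]
        # M-step: update mixing weight and rates.
        pi = sum(r) / len(t)
        l1 = sum(r) / sum(ri * x for ri, x in zip(r, t))
        l2 = (len(t) - sum(r)) / sum((1 - ri) * x for ri, x in zip(r, t))
    return pi, l1, l2

# Synthetic pump lifetimes: a fast-failing and a slow-failing subgroup.
random.seed(1)
data = ([random.expovariate(2.0) for _ in range(300)]
        + [random.expovariate(0.2) for _ in range(300)])
pi, l_fast, l_slow = em_two_exponentials(data)
```

The same E-step/M-step structure carries over to other component distributions and to censored likelihood contributions.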

Relevance: 80.00%

Abstract:

A new method for estimating the time to colonization by methicillin-resistant Staphylococcus aureus (MRSA) in patients is developed in this paper. The time to MRSA colonization is modelled using a Bayesian smoothing approach for the hazard function. Two prior models are discussed: the first-difference prior and the second-difference prior. The second-difference prior model gives smoother estimates of the hazard function and, when applied to data from an intensive care unit (ICU), clearly shows an increasing hazard up to day 13, followed by a decreasing hazard. The results clearly demonstrate that the hazard is not constant and provide a useful quantification of the effect of length of stay on the risk of MRSA colonization.
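A rough frequentist analogue of the second-difference prior is a penalized least-squares smoother that shrinks the second differences of the hazard sequence. The sketch below uses invented numbers, not the ICU data, and is only an illustration of why a second-difference penalty produces smooth hazard estimates:

```python
import numpy as np

def second_difference_smooth(y, lam=10.0):
    """Smooth a sequence y by minimizing ||x - y||^2 + lam * ||D2 x||^2,
    where D2 takes second differences -- the penalized-likelihood
    analogue of a second-difference (random-walk) prior."""
    n = len(y)
    D2 = np.diff(np.eye(n), n=2, axis=0)   # (n-2) x n second-difference matrix
    return np.linalg.solve(np.eye(n) + lam * D2.T @ D2,
                           np.asarray(y, dtype=float))

# Noisy daily hazard estimates (invented: rising, then falling).
raw = [0.02, 0.06, 0.03, 0.09, 0.11, 0.16, 0.12, 0.10, 0.05, 0.04]
smooth = second_difference_smooth(raw)
```

Larger `lam` pulls the estimate toward a straight line; the Bayesian version instead places the penalty in the prior and reports full posterior uncertainty.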

Relevance: 80.00%

Abstract:

It seems Australia is the place to be at the moment when it comes to making and creating horror films. Mark David Ryan explains how you can get involved in this boom industry. If you’re a writer with a passion for scary movies, a wordsmith who watches the occasional horror flick and writing your own has crossed your mind, or a writer who terrifies readers with an interest in screenwriting, then there has never been a better time to write Aussie horror flicks. This article introduces the horror genre’s core characteristics, issues to consider when crafting a horror flick, and provides tips for getting scripts into the hands of producers.

Relevance: 80.00%

Abstract:

It is questionable whether activities like construction, including maintenance and repair, can be considered a single entity or industry, on the basis that different sectors of construction/maintenance use fundamentally distinct resource and skill bases. This creates a number of issues, including for the development of competition and reform policy. de Valance deployed the Structure-Conduct-Performance (SCP) model to delineate sectors of new/installation construction activity and, in doing so, proposed that multiple market structures exist within a given project. The purpose of this paper is to apply the SCP model to a different sector of construction activity, namely air conditioning maintenance, and to test de Valance's proposition concerning the existence of multiple market structures in a supply chain, this time to a built facility. The research method combines secondary data concerning the "Structure" component of the SCP model with primary data regarding the "Conduct" and "Performance" parts. The results provide further support (beyond de Valance's analysis of new/installation activity) for the view that a sector-system approach using the SCP model is a more effective way to analyse market structures in construction activity. This paper also supports de Valance's proposition concerning the existence of multiple market structures in a supply chain to a project/facility.

Relevance: 80.00%

Abstract:

This thesis focuses on the volatile and hygroscopic properties of mixed aerosol species; in particular, the influence that organic species of varying solubility have upon seed aerosols. Aerosol studies were conducted at the Paul Scherrer Institut Laboratory for Atmospheric Chemistry (PSI-LAC, Villigen, Switzerland) and at the Queensland University of Technology International Laboratory for Air Quality and Health (QUT-ILAQH, Brisbane, Australia). The primary measurement tool employed in this program was the Volatilisation and Hygroscopicity Tandem Differential Mobility Analyser (VHTDMA; Johnson et al. 2004). This system was initially developed at QUT within the ILAQH and was completely re-developed as part of this project (see Section 1.4 for a description of this process). The new VHTDMA was deployed to the PSI-LAC, where an analysis of the volatile and hygroscopic properties of ammonium sulphate seeds coated with organic species formed from the photo-oxidation of α-pinene was conducted. This investigation was driven by a desire to understand the influence of atmospherically prevalent organics upon water uptake by material with cloud-forming capabilities. Of particular note from this campaign were the observed influences of partially soluble organic coatings upon inorganic ammonium sulphate seeds above and below their deliquescence relative humidity (DRH). Above the DRH of the seed, increasing the volume fraction of the organic component was shown to reduce the water uptake of the mixed particle. Below the DRH, the organic was shown to activate the water uptake of the seed. This was the first time this effect had been observed for α-pinene-derived SOA. In contrast with the simulated aerosols generated at the PSI-LAC, a case study of the volatile and hygroscopic properties of diesel emissions was undertaken. During this stage of the project, ternary nucleation was shown, for the first time, to be one of the processes involved in the formation of diesel particulate matter. Furthermore, these particles were shown to be coated with a volatile hydrophobic material which prevented water uptake by the highly hygroscopic material below. This result was a first and indicated that previous studies into the hygroscopicity of diesel emissions had erroneously reported the particles to be hydrophobic. Both of these results contradict the previously upheld Zdanovskii-Stokes-Robinson (ZSR) additive rule for water uptake by mixed species. This is an important contribution, as it adds to the weight of evidence that limits the validity of this rule.
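The ZSR additive rule these results challenge is commonly applied to hygroscopic growth factors: under ZSR, each component takes up water independently, so the mixed-particle growth factor is a volume-fraction-weighted cube mean. A minimal sketch with illustrative (not measured) numbers:

```python
def zsr_growth_factor(volume_fractions, growth_factors):
    """Hygroscopic growth factor of a mixed particle under the ZSR
    assumption: components take up water independently at the same
    water activity, so GF_mix^3 = sum(eps_i * GF_i^3), where eps_i are
    the dry volume fractions."""
    assert abs(sum(volume_fractions) - 1.0) < 1e-9
    return sum(eps * gf ** 3
               for eps, gf in zip(volume_fractions, growth_factors)) ** (1 / 3)

# 50/50 by dry volume: a hygroscopic seed (GF = 1.5, illustrative) and an
# organic coating assumed non-hygroscopic (GF = 1.0).
gf_mixed = zsr_growth_factor([0.5, 0.5], [1.5, 1.0])
```

The thesis results are deviations from exactly this kind of prediction: coatings that suppress uptake above the DRH or activate it below the DRH cannot be reproduced by the independent-uptake assumption.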

Relevance: 80.00%

Abstract:

In architecture courses, instilling a wider understanding of the industry-specific representations used in the building industry is normally done under the auspices of technology and science subjects. Traditionally, building industry professionals communicated their design intentions using industry-specific representations. Originally, these mainly two-dimensional representations, such as plans, sections, elevations and schedules, were produced manually, using a drawing board. This manual process has since been digitised in the form of Computer Aided Design and Drafting (CADD) or, ubiquitously, simply CAD. While CAD has significant productivity and accuracy advantages over the earlier manual method, it still only produces industry-specific representations of the design intent. Essentially, CAD is a digital version of the drawing board. The tool used for the production of these representations in industry is still mainly CAD. This is also the approach taken in most traditional university courses and mirrors the reality of the situation in the building industry. A successor to CAD, in the form of Building Information Modelling (BIM), is presently evolving in the construction industry. CAD is mostly a technical tool that conforms to existing industry practices. BIM, on the other hand, is revolutionary both as a technical tool and as an industry practice. Rather than producing representations of design intent, BIM produces an exact virtual prototype of any building that, in an ideal situation, is centrally stored and freely exchanged between the project team. Essentially, BIM builds any building twice: once in the virtual world, where any faults are resolved, and finally in the real world. There is, however, no established model for learning through the use of this technology in architecture courses.
Queensland University of Technology (QUT), a tertiary institution that maintains close links with industry, recognises the importance of equipping their graduates with skills that are relevant to industry. BIM skills are currently in increasing demand throughout the construction industry through the evolution of construction industry practices. As such, during the second half of 2008, QUT 4th year architectural students were formally introduced for the first time to BIM, as both a technology and as an industry practice. This paper will outline the teaching team’s experiences and methodologies in offering a BIM unit (Architectural Technology and Science IV) at QUT for the first time and provide a description of the learning model. The paper will present the results of a survey on the learners’ perspectives of both BIM and their learning experiences as they learn about and through this technology.

Relevance: 80.00%

Abstract:

In the future we will have a detailed ecological model of the whole planet with capabilities to explore and predict the consequences of alternative futures. However, such a planetary eco-model will take time to develop, time to populate with data, and time to validate - time the planet doesn't have. In the interim, we can model the major concentrations of energy use and pollution - our cities - and connect them to form a "talking cities network". Such a networked city model would be much quicker to build and validate. And the advantage of this approach is that it is safer and more effective for us to interfere with the operation of our cities than to tamper directly with the behaviour of natural systems. Essentially, it could be thought of as providing the planet with a nervous system and would empower us to better develop and manage sustainable cities.

Relevance: 80.00%

Abstract:

Background: The accurate measurement of cardiac output (CO) is vital in guiding the treatment of critically ill patients. Invasive or minimally invasive measurement of CO is not without inherent risks to the patient. Skilled Intensive Care Unit (ICU) nursing staff are in an ideal position to assess changes in CO following therapeutic measures. The USCOM (Ultrasonic Cardiac Output Monitor) device is a non-invasive CO monitor whose clinical utility and ease of use require testing.
Objectives: To compare cardiac output measurement using a non-invasive ultrasonic device (USCOM) operated by a non-echocardiographically trained ICU Registered Nurse (RN) with the conventional pulmonary artery catheter (PAC), using both thermodilution and Fick methods.
Design: Prospective observational study.
Setting and participants: Between April 2006 and March 2007, we evaluated 30 spontaneously breathing patients requiring PAC for assessment of heart failure and/or pulmonary hypertension at a tertiary-level cardiothoracic hospital.
Methods: USCOM CO was compared with thermodilution measurements via PAC and with CO estimated using a modified Fick equation. The catheter was inserted by a medical officer, and all USCOM measurements were performed by a senior ICU nurse. Mean values, bias and precision, and mean percentage difference between measures were determined to compare methods. The intra-class correlation statistic was also used to assess agreement. The USCOM time to measure was recorded to assess the learning curve for USCOM use by an ICU RN, and a line of best fit was used to describe the operator learning curve.
Results: In 24 of 30 (80%) patients studied, CO measures were obtained. In 6 of 30 (20%) patients, an adequate USCOM signal was not achieved. The mean differences (±standard deviation) between USCOM and PAC, USCOM and Fick, and Fick and PAC CO were small: −0.34 ± 0.52 L/min, −0.33 ± 0.90 L/min and −0.25 ± 0.63 L/min respectively, across a range of outputs from 2.6 L/min to 7.2 L/min. The percent limits of agreement (LOA) were −34.6% to 17.8% for USCOM and PAC, −49.8% to 34.1% for USCOM and Fick, and −36.4% to 23.7% for PAC and Fick. Signal acquisition time reduced on average by 0.6 min per measure, to less than 10 min by the end of the study.
Conclusions: In 80% of our cohort, USCOM, PAC and Fick measures of CO showed clinically acceptable agreement, and the learning curve for operation of the non-invasive USCOM device by an ICU RN was satisfactorily short. Further work is required in patients receiving positive pressure ventilation.
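Bias and limits-of-agreement figures such as those quoted above follow the standard Bland-Altman recipe: the mean of the paired differences, plus or minus 1.96 standard deviations. A minimal sketch with invented readings (not the study's data):

```python
import math

def bland_altman(method_a, method_b):
    """Bias (mean difference) and 95% limits of agreement between two
    methods measuring the same quantity on the same subjects."""
    diffs = [a - b for a, b in zip(method_a, method_b)]
    n = len(diffs)
    bias = sum(diffs) / n
    sd = math.sqrt(sum((d - bias) ** 2 for d in diffs) / (n - 1))
    return bias, bias - 1.96 * sd, bias + 1.96 * sd

# Invented CO readings (L/min) for two methods on four patients.
uscom_co = [4.0, 5.2, 3.1, 6.0]
pac_co = [4.3, 5.5, 3.6, 6.4]
bias, loa_low, loa_high = bland_altman(uscom_co, pac_co)
```

Dividing the differences by the pairwise means before this calculation yields the percent limits of agreement reported in the abstract.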

Relevance: 80.00%

Abstract:

This is the final report of project 2002-010 Component Life – A Delphi Approach to Life Prediction of Building Material Components. A Delphi survey was conducted to provide expert opinion on the life of components in buildings. Thirty different components were surveyed, with a range of materials, coatings, environments and failure modes considered. These components were chosen to be representative of a wider range of components in the same building microclimate. The survey included both service life (with and without maintenance) and aesthetic life, as well as time to first maintenance. It included marine, industrial, and benign environments, and covered both commercial and residential buildings. In order to obtain answers to this wide range of questions while keeping the survey short enough to complete in a reasonable time, it was broken into five sections: 1. External metal components – residential buildings. 2. Internal metal components – residential buildings. 3. External metal components – commercial buildings. 4. Internal metal components – commercial buildings. 5. Metal connectors in buildings.