914 results for Strong Hyperbolicity


Relevance:

10.00%

Publisher:

Abstract:

There has been considerable research conducted over the last 20 years focused on predicting motor vehicle crashes on transportation facilities. The range of statistical models commonly applied includes binomial, Poisson, Poisson-gamma (or negative binomial), zero-inflated Poisson and negative binomial models (ZIP and ZINB), and multinomial probability models. Given the range of possible modeling approaches and the host of assumptions that accompany each, making an intelligent choice for modeling motor vehicle crash data is difficult. There is little discussion in the literature comparing different statistical modeling approaches, identifying which statistical models are most appropriate for modeling crash data, and providing a strong justification from basic crash principles. In the recent literature, it has been suggested that the motor vehicle crash process can successfully be modeled by assuming a dual-state data-generating process, which implies that entities (e.g., intersections, road segments, pedestrian crossings, etc.) exist in one of two states: perfectly safe and unsafe. As a result, the ZIP and ZINB are two models that have been applied to account for the preponderance of “excess” zeros frequently observed in crash count data. The objective of this study is to provide defensible guidance on how to appropriately model crash data. We first examine the motor vehicle crash process using theoretical principles and a basic understanding of the crash process. It is shown that the fundamental crash process follows a sequence of independent Bernoulli trials with unequal event probabilities, also known as Poisson trials. We examine the evolution of statistical models as they apply to the motor vehicle crash process, and indicate how well they statistically approximate the crash process. We also present the theory behind dual-state process count models, and note why they have become popular for modeling crash data.
A simulation experiment is then conducted to demonstrate how crash data give rise to the “excess” zeros frequently observed in crash data. It is shown that the Poisson and other mixed probabilistic structures are approximations assumed for modeling the motor vehicle crash process. Furthermore, it is demonstrated that under certain (fairly common) circumstances excess zeros are observed, and that these circumstances arise from low exposure and/or inappropriate selection of time/space scales, not from an underlying dual-state process. In conclusion, carefully selecting the time/space scales for analysis, including an improved set of explanatory variables and/or unobserved heterogeneity effects in count regression models, or applying small-area statistical methods (observations with low exposure) represent the most defensible modeling approaches for datasets with a preponderance of zeros.
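The core of the simulation argument can be sketched in a few lines: model each site as a run of Poisson trials (independent Bernoulli trials with unequal, strictly positive probabilities) and observe that low exposure alone produces a large share of zero counts, with no "perfectly safe" state anywhere. The site count, trial count and probability range below are illustrative assumptions, not the study's parameters.

```python
import random

def simulate_crash_counts(n_sites=5000, trials_per_period=100, seed=1):
    """Simulate crash counts as Poisson trials: each site draws its own
    (unequal) per-trial crash probability, here uniform on (0, 0.01).
    Short observation windows / low exposure yield many zero counts."""
    rng = random.Random(seed)
    counts = []
    for _ in range(n_sites):
        p = rng.uniform(0.0, 0.01)  # heterogeneous, low, but never zero
        counts.append(sum(rng.random() < p for _ in range(trials_per_period)))
    return counts

counts = simulate_crash_counts()
zero_share = sum(c == 0 for c in counts) / len(counts)
# A majority of sites record zero crashes even though every site has a
# strictly positive crash probability, i.e. no dual "safe" state exists.
```

Lengthening the observation period (more trials per period) shrinks `zero_share`, which is the paper's point about choosing time/space scales.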


In rural low-voltage networks, distribution lines are usually highly resistive. When many distributed generators are connected to such lines, power sharing among them is difficult under conventional droop control, as real and reactive power are strongly coupled with each other. A high droop gain can alleviate this problem but may drive the system to instability. To overcome this, two droop control methods are proposed for accurate load sharing with a frequency droop controller. The first method assumes no communication among the distributed generators and regulates the output voltage and frequency, ensuring acceptable load sharing. For this purpose, the droop equations are modified with a transformation matrix based on the line R/X ratio. The second proposed method, with minimal low-bandwidth communication, modifies the reference frequency of the distributed generators based on the active and reactive power flow in the lines connected to the points of common coupling. The performance of these two proposed controllers is compared, through time-domain simulation of a test system, with that of a controller that relies on an expensive high-bandwidth communication system. The magnitudes of the power-sharing errors of the three droop control schemes are evaluated and tabulated.
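The first method's idea can be sketched as follows. The abstract does not give the exact transformation matrix, so the sketch below uses the orthogonal R/X rotation commonly applied for this purpose: rotate (P, Q) by the line impedance angle so that the frequency and voltage droops act on decoupled quantities. All gains and line parameters are hypothetical.

```python
import math

def transformed_droop(P, Q, R, X, f0=50.0, V0=230.0, m=1e-4, n=1e-3):
    """Droop control with an R/X-based rotation (illustrative sketch,
    not the paper's exact matrix). On a resistive line (R >> X), raw P
    and Q both move with voltage; rotating them by the impedance angle
    restores the decoupling that conventional droop assumes."""
    Z = math.hypot(R, X)
    # Orthogonal transformation based on the line R/X ratio:
    P_t = (X / Z) * P - (R / Z) * Q
    Q_t = (R / Z) * P + (X / Z) * Q
    f = f0 - m * P_t  # frequency droop acts on the transformed power
    V = V0 - n * Q_t  # voltage droop acts on the transformed power
    return f, V
```

For a highly resistive line (R = 0.6 Ω, X = 0.1 Ω), real power output mostly lowers the voltage setpoint rather than the frequency, reflecting the resistive coupling the paper describes.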


The use of appropriate financial incentives within construction projects can contribute to strong alignment of project stakeholder motivation with project goals. However, effective incentive system design can be a challenging task and takes skillful planning by client managers in the early stages of a project. In response to a lack of information currently available to construction clients in this area, this paper explores the features of a successful incentive system and identifies key learnings for client managers to consider when designing incentives. Our findings, based on data from a large Australian case study, suggest that key stakeholders place greater emphasis on the project management processes that support incentives than on the incentive itself. Further, contractors need adequate time and information to accurately estimate construction costs prior to their tender price submission to ensure cost-focused incentive goals remain achievable. Thus, client managers should be designing incentives as part of a supportive procurement strategy to maximize project stakeholder motivation and prevent goal misalignment.


Purpose: To compare subjective blur limits for cylinder and defocus. Method: Blur was induced with a deformable, adaptive-optics mirror, either when the subjects’ own astigmatisms were corrected or when both astigmatisms and higher-order aberrations were corrected. Subjects were cyclopleged and viewed through 5 mm artificial pupils. Black letter targets (0.1, 0.35 and 0.6 logMAR) were presented on white backgrounds. Results: For ten subjects, blur limits were approximately 50% greater for cylinder than for defocus (in diopters). While axis effects were considerable for some individuals, the overall effect of axis was not strong, with the 0° (or 180°) axis having about 20% greater limits than oblique axes. In a second experiment with text (equivalent in angle to N10 print at a 40 cm distance), cylinder blur limits for 6 subjects were approximately 30% greater than those for defocus; this percentage was slightly smaller than for the three letter targets. Blur limits for the text were intermediate between those for 0.35 logMAR and 0.6 logMAR letters. Extensive blur-limit measurements for one subject with single letters did not show the expected interactions between target detail orientation and cylinder axis. Conclusion: Subjective blur limits for cylinder are 30%-50% greater than those for defocus, with the overall influence of cylinder axis being about 20%.


This paper explores the interplay between individual values, espoused organisational values and the values of the organisational culture in practice in light of a recent Royal Commission in Queensland, Australia, which highlighted systematic failures in patient care. The lack of congruence among values at these levels impacts upon the ethical decision making of health managers. The presence of institutional ethics regimes such as the Public Sector Ethics Act 1994 (Qld) and agency codes of conduct is not sufficient to counteract the negative influence of informal codes of practice that undermine espoused organisational values and community standards. The ethical decision-making capacity of health care managers remains at the front line in the battle against unethical and unprofessional practice. What is known about the topic? Value congruence theory focuses on the conflicts between individual and organisational values. Congruence between individual values, espoused values and values expressed in everyday practice can only be achieved by ensuring that such shared values are an ever-present factor in managerial decision making. What does this paper add? The importance of value congruence in building and sustaining a healthy organisational culture is confirmed by the evidence presented in the Bundaberg Hospital Inquiry. The presence of strong individual values among staff and strong espoused values in line with community expectations and backed up by legislation and ethics regimes were not, in themselves, sufficient to ensure a healthy organisational culture and prevent unethical, and possibly illegal, behaviour. What are the implications for practitioners? Managers must incorporate ethics in decision making to establish and maintain the nexus between individual and organisational values that is a vital component of a healthy organisational culture.


In 2008 the Australian government decided to remove white blood cells from all blood products. This policy of universal leucodepletion was a change from the existing policy of supplying leucodepleted products to high-risk patients only. The decision was made without strong information about the cost-effectiveness of universal leucodepletion. The aims of this policy analysis are to generate cost-effectiveness data about universal leucodepletion, and to add to our understanding of the role of evidence and the political reality of healthcare decision-making in Australia. The cost-effectiveness analysis revealed that universal leucodepletion costs $398,943 to save one year of life. This exceeds the normal maximum threshold for Australia. We discuss this result within the context of how policy decisions are made about blood, and how it relates to the theory and process of policy making. We conclude that the absence of a strong voice for cost-effectiveness was an important omission in this decision.
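The headline figure is an incremental cost-effectiveness ratio (ICER): the extra cost of the new policy divided by the extra life-years it gains over the old one. A minimal sketch, with purely illustrative numbers (not the study's actual inputs):

```python
def icer(cost_new, cost_old, ly_new, ly_old):
    """Incremental cost-effectiveness ratio: extra dollars spent per
    extra life-year gained by the new policy over the old one."""
    return (cost_new - cost_old) / (ly_new - ly_old)

# Illustrative only: if universal leucodepletion costs $39.9M more than
# targeted leucodepletion and gains 100 extra life-years, the ICER is
# $399,000 per life-year saved, in the region of the study's estimate.
ratio = icer(cost_new=49_900_000, cost_old=10_000_000, ly_new=600, ly_old=500)
```

A policy is typically considered cost-effective when its ICER falls below a jurisdiction's willingness-to-pay threshold, which is the comparison the study draws.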


The impact of what has been broadly labelled the knowledge economy has been such that, even in the absence of precise measurement, it is the undoubted dynamo of today’s global market, and an essential part of any global city. The socio-economic importance of knowledge production in a knowledge economy is clear, and it is an emerging social phenomenon and research agenda in geographical studies. Knowledge production, and where, how and by whom it is produced, is an urban phenomenon that is poorly understood in an era of strong urbanisation. This paper focuses on knowledge community precincts as the catalytic magnet infrastructures impacting on knowledge production in cities. The paper discusses the increasing importance of knowledge-based urban development within the paradigm of the knowledge economy, and the role of knowledge community precincts as instruments to seed the foundation of knowledge production in cities. The paper then explores the knowledge-based urban development potential, and particularly the knowledge community precinct development potential, of Sydney, Melbourne and Brisbane, and benchmarks these cities against Boston, Massachusetts.


Atmospheric ions are produced by many natural and anthropogenic sources and their concentrations vary widely between different environments. There is very little information on their concentrations in different types of urban environments, how they compare across these environments and what their dominant sources are. In this study, we measured airborne concentrations of small ions, particles and net particle charge at 32 different outdoor sites in and around a major city in Australia and identified the main ion sources. Sites were classified into seven groups: park, woodland, city centre, residential, freeway, power lines and power substation. Generally, parks were situated away from ion sources and represented the urban background value of about 270 ions cm⁻³. Median concentrations in all other groups were significantly higher than in the parks. We show that motor vehicles and power transmission systems are two major ion sources in urban areas. Power lines and substations constituted strong unipolar sources, while motor vehicle exhaust constituted strong bipolar sources. The small ion concentration in urban residential areas was about 960 cm⁻³. At sites where ion sources were co-located with particle sources, ion concentrations were suppressed by the ion-particle attachment process. These results improve our understanding of air ion distribution and its interaction with particles in the urban outdoor environment.


Reactive oxygen species (ROS) and related free radicals are considered to be key factors underpinning the various adverse health effects associated with exposure to ambient particulate matter. Therefore, measurement of ROS is a crucial factor for assessing the potential toxicity of particles. In this work, a novel profluorescent nitroxide, BPEAnit, was investigated as a probe for detecting particle-derived ROS. BPEAnit has a very low fluorescence emission due to inherent quenching by the nitroxide group, but upon radical trapping or redox activity, a strong fluorescence is observed. BPEAnit was tested for detection of ROS present in mainstream and sidestream cigarette smoke. In the case of mainstream cigarette smoke, there was a linear increase in fluorescence intensity with an increasing number of cigarette puffs, equivalent to an average of 101 nmol ROS per cigarette based on the number of moles of the probe reacted. Sidestream cigarette smoke sampled from an environmental chamber exposed BPEAnit to much lower concentrations of particles, but still resulted in a clearly detectable increase in fluorescence intensity with sampling time. It was calculated that the amount of ROS was equivalent to 50 ± 2 nmol per mg of particulate matter; however, this value decreased with ageing of the particles in the chamber. Overall, BPEAnit was shown to provide a sensitive response related to the oxidative capacity of the particulate matter. These findings present a good basis for employing the new BPEAnit probe for the investigation of particle-related ROS generated from cigarette smoke as well as from other combustion sources.
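The puff-wise quantification rests on two steps: fitting the linear trend of fluorescence versus puff count, then converting the slope to moles of reacted probe via a calibration factor. A sketch of that arithmetic follows; the readings and the calibration constant are entirely hypothetical, not the study's data.

```python
def fit_slope(xs, ys):
    """Ordinary least-squares slope of ys against xs (pure Python)."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sxx = sum((x - mx) ** 2 for x in xs)
    return sxy / sxx

# Hypothetical calibration: fluorescence units per nmol of reacted probe,
# as would be obtained from a fully reacted BPEAnit standard.
F_PER_NMOL = 0.12

puffs = [1, 2, 3, 4, 5, 6, 7, 8]
fluorescence = [1.6, 2.8, 4.1, 5.3, 6.4, 7.7, 8.9, 10.1]  # made-up readings

slope = fit_slope(puffs, fluorescence)  # fluorescence increase per puff
ros_per_puff_nmol = slope / F_PER_NMOL  # nmol of ROS trapped per puff
```

Multiplying the per-puff figure by the puff count per cigarette would give the per-cigarette total the abstract reports.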


This study reports the potential toxicological impact of particles produced during biomass combustion by an automatic pellet boiler and a traditional logwood stove under various combustion conditions, using the novel profluorescent nitroxide probe BPEAnit. This probe is weakly fluorescent, but yields strong fluorescence emission upon radical trapping or redox activity. Samples were collected by bubbling aerosol through an impinger containing BPEAnit solution, followed by fluorescence measurement. The fluorescence of BPEAnit was measured for particles produced during various combustion phases: at the beginning of burning (cold start), during stable combustion after refilling with fuel (warm start) and under poor burning conditions. For particles produced by the logwood stove under cold-start conditions, significantly higher amounts of reactive species per unit of particulate mass were observed compared to emissions produced during a warm start. In addition, sampling of logwood burning emissions after passing through a thermodenuder at 250 °C resulted in an 80-100% reduction of the fluorescence signal of the BPEAnit probe, indicating that the majority of reactive species were semivolatile. Moreover, the amount of reactive species showed a strong correlation with the amount of particulate organic material. This indicates the importance of semivolatile organics in particle-related toxicity. Particle emissions from the pellet boiler, although of similar mass concentration, were not observed to lead to an increase in fluorescence signal during any of the combustion phases.


This paper investigates how to interface the wireless application protocol (WAP) architecture to a SCADA system running the distributed network protocol (DNP) in a power process plant. DNP is a well-developed protocol for supervisory control and data acquisition (SCADA) systems, but the system control centre and remote terminal units (RTUs) are presently connected through a local area network. The conditions in a process plant are harsh and the site is remote. Resources for data communication are difficult to obtain under these conditions; thus, wireless communication through a mobile phone network is a practical and efficient option in a process plant environment. The mobile communication industry and the public have a strong interest in applying WAP technology in mobile phone networks, and the WAP application programming interface (API) in power industry applications is one area that requires extensive investigation.
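The gateway pattern under investigation, a server-side component that polls RTU points over the SCADA protocol and re-serves them in a compact form to thin mobile clients, can be sketched generically. The RTU read below is a stub standing in for a real DNP3 poll, and all point names and values are invented for illustration.

```python
import json

def read_rtu_points():
    """Stub standing in for a DNP3 poll of an RTU; a real deployment
    would use a DNP3 master library here. Returns point name -> value."""
    return {"breaker_1": 1, "bus_voltage_kv": 132.7, "load_mw": 48.2}

def wap_payload(points):
    """Serialize RTU points into a compact, whitespace-free payload
    suitable for a low-bandwidth mobile (WAP-style) client."""
    return json.dumps(points, separators=(",", ":"))

payload = wap_payload(read_rtu_points())
```

In a WAP deployment the payload would be rendered as WML by the gateway; the design point is that the bandwidth-hungry SCADA polling stays on the plant side while the mobile link carries only the condensed snapshot.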


Short-term traffic flow data are characterized by rapid and dramatic fluctuations, reflecting the frequent congestion in the lane, which exhibits strongly nonlinear behaviour. Traffic state estimation based on data gathered by electronic sensors is critical for intelligent traffic management and control. In this paper, a solution to freeway traffic estimation in Beijing is proposed using a particle filter based on a macroscopic traffic flow model, which estimates both traffic density and speed. The particle filter is a nonlinear estimation method, which has obvious advantages for traffic flow prediction. However, as the sampling period increases, the traffic state curve becomes much more volatile, so prediction accuracy suffers and forecasting becomes more difficult. In this paper, the particle filter model is applied to estimate the short-term traffic flow. A numerical study is conducted based on Beijing freeway data with a sampling period of 2 min. The relatively high accuracy of the results indicates the superiority of the proposed model.
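The estimation loop can be sketched as a minimal bootstrap particle filter over a scalar traffic density. The paper propagates particles through a macroscopic flow model; the random-walk process model, noise levels and observations below are simplifying assumptions for illustration only.

```python
import math
import random

def particle_filter(observations, n_particles=500,
                    proc_std=2.0, meas_std=4.0, seed=0):
    """Minimal bootstrap particle filter for a scalar traffic density
    (veh/km). Predict with a random-walk process model (a stand-in for
    the paper's macroscopic flow model), weight by a Gaussian
    measurement likelihood, estimate, then resample."""
    rng = random.Random(seed)
    particles = [rng.uniform(0.0, 100.0) for _ in range(n_particles)]
    estimates = []
    for z in observations:
        # Predict: propagate each particle through the process model.
        particles = [p + rng.gauss(0.0, proc_std) for p in particles]
        # Update: weight particles by the measurement likelihood.
        weights = [math.exp(-0.5 * ((z - p) / meas_std) ** 2)
                   for p in particles]
        total = sum(weights)
        weights = [w / total for w in weights]
        # Estimate: posterior mean, then multinomial resampling.
        estimates.append(sum(w * p for w, p in zip(weights, particles)))
        particles = rng.choices(particles, weights=weights, k=n_particles)
    return estimates

obs = [30, 32, 35, 40, 44, 47, 50]  # noisy density readings (veh/km)
est = particle_filter(obs)
```

Because the update step makes no linearity or Gaussianity assumption about the state dynamics, the same loop works unchanged with a nonlinear macroscopic model in the predict step, which is the advantage the abstract cites.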


We review all journal articles based on “PSED-type” research, i.e., longitudinal, empirical studies of large probability samples of on-going, business start-up efforts. We conclude that the research stream has yielded interesting findings; sometimes by confirming prior research with a less bias-prone methodology and at other times by challenging whether prior conclusions are valid for the early stages of venture development. Most importantly, the research has addressed new, process-related research questions that prior research has shunned or been unable to study in a rigorous manner. The research has revealed an enormous and fascinating variability in new venture creation that also makes it challenging to arrive at broadly valid generalizations. An analysis of the findings across studies as well as an examination of those studies that have been relatively more successful at explaining outcomes give good guidance regarding what is required in order to achieve strong and credible results. We compile and present such advice to users of existing data sets and designers of new projects in the following areas: Statistically representative and/or theoretically relevant sampling; Level of analysis issues; Dealing with process heterogeneity; Dealing with other heterogeneity issues, and Choice and interpretation of dependent variables.


The technologies employed for the preparation of conventional tissue engineering scaffolds restrict the materials choice and the extent to which the architecture can be designed. Here we show the versatility of stereolithography with respect to materials and freedom of design. Porous scaffolds are designed with computer software and built with either a poly(d,l-lactide)-based resin or a poly(d,l-lactide-co-ε-caprolactone)-based resin. Characterisation of the scaffolds by micro-computed tomography shows excellent reproduction of the designs. The mechanical properties are evaluated in compression, and show good agreement with finite element predictions. The mechanical properties of scaffolds can be controlled by the combination of material and scaffold pore architecture. The presented technology and materials enable an accurate preparation of tissue engineering scaffolds with a large freedom of design, and properties ranging from rigid and strong to highly flexible and elastic.


Increasingly, celebrities appear not only as endorsers for products but are apparently engaged in entrepreneurial roles as initiators, owners and perhaps even managers in the ventures that market the products they promote. Despite being extensively referred to in popular media, scholars have been slow to recognise the importance of this new phenomenon. This thesis argues theoretically and shows empirically that celebrity entrepreneurs are more effective communicators than typical celebrity endorsers because of their increased engagement with ventures. I theorise that greater engagement increases the celebrity’s emotional involvement as perceived by consumers. This is an endorser quality thus far neglected in the marketing communications literature. In turn, emotional involvement, much like the empirically established dimensions trustworthiness, expertise and attractiveness, should affect traditional outcome variables such as attitude towards the advertisement and brand. On the downside, increases in celebrity engagement may lead to relatively stronger and worsening changes in attitudes towards the brand if and when negative information about the celebrity is revealed. A series of eight experiments was conducted on 781 Swedish and Baltic students and 151 Swedish retirees. Though there were nuanced differences and additional complexities in each experiment, participants’ reactions to advertisements containing a celebrity portrayed as a typical endorser or entrepreneur were recorded. The overall results of these experiments suggest that emotional involvement can be successfully operationalised as distinct from variables previously known to influence communication effectiveness. In addition, emotional involvement has positive effects on attitudes toward the advertisement and brand that are as strong as the predictors traditionally applied in the marketing communications literature.
Moreover, the celebrity entrepreneur condition in the experimental manipulation consistently led to an increase in emotional involvement and, to a lesser extent, trustworthiness, but not expertise or attractiveness. Finally, negative celebrity information led to changes in participants’ attitudes towards the brand that were more strongly negative for celebrity entrepreneurs than for celebrity endorsers. In addition, the effect of negative celebrity information on a company’s brand is worse when the company supports the celebrity rather than firing them. However, this effect did not appear to interact with the celebrity’s purported engagement.