958 results for "strong"


Relevance:

10.00%

Publisher:

Abstract:

The rapid growth in the number of online services leads to an increasing number of different digital identities each user needs to manage. As a result, many people feel overloaded with credentials, which in turn negatively impacts their ability to manage them securely. Passwords are perhaps the most common type of credential used today. To avoid the tedious task of remembering difficult passwords, users often behave less securely by using low-entropy, weak passwords. Weak passwords and bad password habits represent security threats to online services. Some solutions have been developed to eliminate the need for users to create and manage passwords. A typical solution is based on giving the user a hardware token that generates one-time passwords (OTPs), i.e. passwords valid for a single session or transaction. Unfortunately, most of these solutions do not satisfy scalability and/or usability requirements, or they are simply insecure. In this paper, we propose a scalable OTP solution using mobile phones, based on trusted computing technology, that combines enhanced usability with strong security.
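As a hedged illustration of the hardware-token approach described above (not the paper's trusted-computing scheme, which is not reproduced here), a standard HMAC-based one-time password (HOTP, RFC 4226) can be generated as follows:

```python
import hashlib
import hmac
import struct

def hotp(secret: bytes, counter: int, digits: int = 6) -> str:
    """HMAC-based one-time password (RFC 4226): each counter value
    yields a password valid for a single use."""
    msg = struct.pack(">Q", counter)
    digest = hmac.new(secret, msg, hashlib.sha1).digest()
    offset = digest[-1] & 0x0F                      # dynamic truncation
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)

# Token and server share the secret and a synchronized counter;
# this secret/counter pair is the RFC 4226 test vector.
print(hotp(b"12345678901234567890", 0))  # -> "755224"
```

Each new counter value invalidates the previous password, which is what makes the scheme single-use.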


No-tillage (NT) management has been promoted as a practice capable of offsetting greenhouse gas (GHG) emissions because of its ability to sequester carbon in soils. However, true mitigation is only possible if the overall impact of NT adoption reduces the net global warming potential (GWP) determined by fluxes of the three major biogenic GHGs (i.e. CO2, N2O, and CH4). We compiled all available data from comparisons of soil-derived GHG emissions between conventionally tilled (CT) and NT systems for humid and dry temperate climates. Newly converted NT systems increase GWP relative to CT practices, in both humid and dry climate regimes, and longer-term adoption (>10 years) only significantly reduces GWP in humid climates. Mean cumulative GWP over a 20-year period is also reduced under continuous NT in dry areas, but with a high degree of uncertainty. Emissions of N2O drive much of the trend in net GWP, suggesting improved nitrogen management is essential to realize the full benefit from carbon storage in the soil for purposes of global warming mitigation. Our results indicate a strong time dependency in the GHG mitigation potential of NT agriculture, demonstrating that GHG mitigation by adoption of NT is much more variable and complex than previously considered, and policy plans to reduce global warming through this land management practice need further scrutiny to ensure success.
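The GWP accounting this abstract relies on can be sketched as a CO2-equivalent sum over the three gases. The sketch below uses IPCC AR4 100-year warming potentials and entirely hypothetical flux numbers; the study's own factors and measurements may differ:

```python
# 100-year global warming potentials (IPCC AR4); the study's exact
# factors may differ.
GWP100 = {"CO2": 1, "CH4": 25, "N2O": 298}

def net_gwp(fluxes_kg_ha: dict) -> float:
    """Net global warming potential of soil GHG fluxes, in kg CO2-eq/ha.
    Negative fluxes (e.g. carbon sequestration) reduce the total."""
    return sum(GWP100[gas] * flux for gas, flux in fluxes_kg_ha.items())

# Hypothetical NT vs CT comparison: NT sequesters CO2 but emits more
# N2O, so a modest N2O increase can offset the carbon gain.
nt = net_gwp({"CO2": -300.0, "N2O": 2.5, "CH4": -0.5})
ct = net_gwp({"CO2": 0.0, "N2O": 1.0, "CH4": -0.5})
print(nt - ct)  # positive -> NT increases net GWP despite C storage
```

Because N2O carries a large per-kilogram weight, small changes in N2O flux dominate the net result, which is why the abstract singles out nitrogen management.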


Numerous challenges remain in the successful clinical translation of cell-based therapies for musculoskeletal tissue repair, including the identification of an appropriate cell source and a viable cell delivery system. The aim of this study was to investigate the attachment, colonization, and osteogenic differentiation of two stem cell types, human mesenchymal stem cells (hMSCs) and human amniotic fluid stem (hAFS) cells, on electrospun nanofiber meshes. We demonstrate that nanofiber meshes are able to support these cell functions robustly, with both cell types showing strong osteogenic potential. Differences in the kinetics of osteogenic differentiation were observed between hMSCs and hAFS cells, with the hAFS cells displaying a delayed alkaline phosphatase peak, but elevated mineral deposition, compared to hMSCs. We also compared the cell behavior on nanofiber meshes to that on tissue culture plastic, and observed that there is delayed initial attachment and proliferation on meshes, but enhanced mineralization at a later time point. Finally, cell-seeded nanofiber meshes were found to be effective in colonizing three-dimensional scaffolds in an in vitro system. This study provides support for the use of the nanofiber mesh as a model surface for cell culture in vitro, and as a cell delivery vehicle for the repair of bone defects in vivo.


This paper uses multivariate analysis to examine how countries' tax morale and institutional quality affect the shadow economy. The literature strongly emphasizes the quantitative importance of these factors in understanding the level of, and changes in, the shadow economy. Newly available data sources offer a unique opportunity to further illuminate a topic that has received increased attention. After controlling for a variety of potential factors, we find strong evidence that higher tax morale and higher institutional quality lead to a smaller shadow economy.


In this reply we show that the comment by Nüesch (2009) on our initial contribution (Torgler and Schmidt 2007) has several shortcomings. He suggests that professional soccer wages seem to buy talent rather than motivation. We therefore provide a larger set of talent proxies and estimations to check whether this assertion is correct. Our results indicate that his conclusion is problematic. We still observe a strong motivational effect, and in some cases the effect is even larger than the talent effect. A further key problem in Nüesch's contribution is that he neglects to consider the relevance of the relative salary situation.


There has been considerable research conducted over the last 20 years focused on predicting motor vehicle crashes on transportation facilities. The range of statistical models commonly applied includes binomial, Poisson, Poisson-gamma (or negative binomial), zero-inflated Poisson and negative binomial models (ZIP and ZINB), and multinomial probability models. Given the range of possible modeling approaches and the host of assumptions with each modeling approach, making an intelligent choice for modeling motor vehicle crash data is difficult. There is little discussion in the literature comparing different statistical modeling approaches, identifying which statistical models are most appropriate for modeling crash data, and providing a strong justification from basic crash principles. In the recent literature, it has been suggested that the motor vehicle crash process can successfully be modeled by assuming a dual-state data-generating process, which implies that entities (e.g., intersections, road segments, pedestrian crossings, etc.) exist in one of two states—perfectly safe and unsafe. As a result, the ZIP and ZINB are two models that have been applied to account for the preponderance of "excess" zeros frequently observed in crash count data. The objective of this study is to provide defensible guidance on how to appropriately model crash data. We first examine the motor vehicle crash process using theoretical principles and a basic understanding of the crash process. It is shown that the fundamental crash process follows a Bernoulli trial with unequal probability of independent events, also known as Poisson trials. We examine the evolution of statistical models as they apply to the motor vehicle crash process, and indicate how well they statistically approximate the crash process. We also present the theory behind dual-state process count models, and note why they have become popular for modeling crash data.
A simulation experiment is then conducted to demonstrate how the crash process gives rise to the "excess" zeros frequently observed in crash data. It is shown that the Poisson and other mixed probabilistic structures are approximations assumed for modeling the motor vehicle crash process. Furthermore, it is demonstrated that under certain (fairly common) circumstances excess zeros are observed—and that these circumstances arise from low exposure and/or inappropriate selection of time/space scales and not an underlying dual-state process. In conclusion, carefully selecting the time/space scales for analysis, including an improved set of explanatory variables and/or unobserved heterogeneity effects in count regression models, or applying small-area statistical methods (observations with low exposure) represent the most defensible modeling approaches for datasets with a preponderance of zeros.
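The simulation idea can be sketched as follows. Each site runs independent Bernoulli trials with a site-specific crash probability (Poisson trials); the site count, trial count, and probability range below are illustrative assumptions, not the paper's actual experiment:

```python
import random

random.seed(42)

def simulate_crash_counts(n_sites: int, trials_per_site: int,
                          p_low: float, p_high: float) -> list:
    """Each site runs independent Bernoulli trials with a site-specific
    crash probability; no site is 'perfectly safe'."""
    counts = []
    for _ in range(n_sites):
        p = random.uniform(p_low, p_high)  # unequal probabilities across sites
        counts.append(sum(random.random() < p for _ in range(trials_per_site)))
    return counts

# Low exposure (few trials) and small probabilities -> many observed
# zeros, even though every site has a nonzero crash risk.
counts = simulate_crash_counts(1000, 50, 0.001, 0.02)
zero_share = counts.count(0) / len(counts)
print(f"share of zero-crash sites: {zero_share:.2f}")
```

With these assumed parameters a majority of sites record zero crashes, illustrating how low exposure alone, without any dual-state mechanism, produces an apparent excess of zeros.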


In rural low-voltage networks, distribution lines are usually highly resistive. When many distributed generators are connected to such lines, power sharing among them is difficult under conventional droop control, as the real and reactive power are strongly coupled with each other. A high droop gain can alleviate this problem but may drive the system to instability. To overcome this, two droop control methods are proposed for accurate load sharing with a frequency droop controller. The first method requires no communication among the distributed generators and regulates the output voltage and frequency, ensuring acceptable load sharing. For this purpose, the droop equations are modified with a transformation matrix based on the line R/X ratio. The second proposed method, with minimal low-bandwidth communication, modifies the reference frequency of the distributed generators based on the active and reactive power flow in the lines connected to the points of common coupling. The performance of these two proposed controllers is compared, through time-domain simulation of a test system, with that of a controller that relies on an expensive high-bandwidth communication system. The magnitudes of the power-sharing errors under these three droop control schemes are evaluated and tabulated.
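The transformation-matrix idea behind the first method can be sketched with the standard orthogonal P-Q rotation based on the line impedance angle. This is a hypothetical stand-in for illustration; the paper's exact matrix is not reproduced here:

```python
import math

def decouple_pq(P: float, Q: float, R: float, X: float):
    """Rotate measured active/reactive power into decoupled quantities
    using the line R/X ratio (an orthogonal transformation commonly
    used for droop control over resistive lines)."""
    Z = math.hypot(R, X)
    P_prime = (X / Z) * P - (R / Z) * Q
    Q_prime = (R / Z) * P + (X / Z) * Q
    return P_prime, Q_prime

# Highly resistive rural feeder (R >> X): the rotated P' is driven
# mainly by -Q and Q' mainly by P, and the modified droop equations
# then act on these decoupled quantities instead of raw P and Q.
print(decouple_pq(10.0, 2.0, R=0.8, X=0.1))
```

Because the transformation is a pure rotation, it preserves the apparent power magnitude while reassigning which droop channel (frequency or voltage) controls which physical quantity.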


The use of appropriate financial incentives within construction projects can contribute to strong alignment of project stakeholder motivation with project goals. However, effective incentive system design can be a challenging task and takes skillful planning by client managers in the early stages of a project. In response to a lack of information currently available to construction clients in this area, this paper explores the features of a successful incentive system and identifies key learnings for client managers to consider when designing incentives. Our findings, based on data from a large Australian case study, suggest that key stakeholders place greater emphasis on the project management processes that support incentives than on the incentive itself. Further, contractors need adequate time and information to accurately estimate construction costs prior to their tender price submission to ensure cost-focused incentive goals remain achievable. Thus, client managers should be designing incentives as part of a supportive procurement strategy to maximize project stakeholder motivation and prevent goal misalignment.


Purpose: To compare subjective blur limits for cylinder and defocus. Method: Blur was induced with a deformable adaptive-optics mirror when either the subjects' own astigmatisms were corrected or when both astigmatisms and higher-order aberrations were corrected. Subjects were cyclopleged and had 5 mm artificial pupils. Black letter targets (0.1, 0.35 and 0.6 logMAR) were presented on white backgrounds. Results: For ten subjects, blur limits were approximately 50% greater for cylinder than for defocus (in diopters). While there were considerable effects of axis for individuals, overall the axis effect was not strong, with the 0° (or 180°) axis having about 20% greater limits than oblique axes. In a second experiment with text (equivalent in angle to N10 print at 40 cm distance), cylinder blur limits for six subjects were approximately 30% greater than those for defocus; this percentage was slightly smaller than for the three letters. Blur limits for the text were intermediate between those of 0.35 logMAR and 0.6 logMAR letters. Extensive blur limit measurements for one subject with single letters did not show the expected interactions between target detail orientation and cylinder axis. Conclusion: Subjective blur limits for cylinder are 30%-50% greater than those for defocus, with the overall influence of cylinder axis being about 20%.


This paper explores the interplay between individual values, espoused organisational values and the values of the organisational culture in practice in light of a recent Royal Commission in Queensland, Australia, which highlighted systematic failures in patient care. The lack of congruence among values at these levels impacts upon the ethical decision making of health managers. The presence of institutional ethics regimes such as the Public Sector Ethics Act 1994 (Qld) and agency codes of conduct are not sufficient to counteract the negative influence of informal codes of practice that undermine espoused organisational values and community standards. The ethical decision-making capacity of health care managers remains at the front line in the battle against unethical and unprofessional practice. What is known about the topic? Value congruence theory focuses on the conflicts between individual and organisational values. Congruence between individual values, espoused values and values expressed in everyday practice can only be achieved by ensuring that such shared values are an ever-present factor in managerial decision making. What does this paper add? The importance of value congruence in building and sustaining a healthy organisational culture is confirmed by the evidence presented in the Bundaberg Hospital Inquiry. The presence of strong individual values among staff and strong espoused values in line with community expectations and backed up by legislation and ethics regimes were not, in themselves, sufficient to ensure a healthy organisational culture and prevent unethical, and possibly illegal, behaviour. What are the implications for practitioners? Managers must incorporate ethics in decision making to establish and maintain the nexus between individual and organisational values that is a vital component of a healthy organisational culture.


In 2008 the Australian government decided to remove white blood cells from all blood products. This policy of universal leucodepletion was a change from the existing policy of supplying leucodepleted products to high-risk patients only. The decision was made without strong information about the cost-effectiveness of universal leucodepletion. The aims of this policy analysis are to generate cost-effectiveness data about universal leucodepletion, and to add to our understanding of the role of evidence and the political reality of healthcare decision-making in Australia. The cost-effectiveness analysis revealed that universal leucodepletion costs $398,943 to save one year of life. This exceeds the normal maximum threshold for Australia. We discuss this result within the context of how policy decisions are made about blood, and how it relates to the theory and process of policy making. We conclude that the absence of a strong voice for cost-effectiveness was an important omission in this decision.
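The cost-effectiveness figure quoted above is a simple ratio of incremental cost to life-years gained. The sketch below reproduces the arithmetic with hypothetical inputs chosen only to match the reported $398,943 per life-year; the analysis's actual cost and outcome inputs are not given in the abstract:

```python
def cost_per_life_year(incremental_cost: float, life_years_gained: float) -> float:
    """Incremental cost-effectiveness ratio (ICER): extra dollars spent
    per additional year of life saved."""
    return incremental_cost / life_years_gained

# Hypothetical figures: an extra $39.9M spent to gain 100 life-years
# yields roughly the ratio reported for universal leucodepletion.
print(f"${cost_per_life_year(39_894_300, 100):,.0f} per life-year")
```

A decision threshold then asks whether this ratio falls below the maximum a health system is willing to pay per life-year, which the abstract says it does not.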


The impact of what has been broadly labelled the knowledge economy has been such that, even in the absence of precise measurement, it is the undoubted dynamo of today's global market, and an essential part of any global city. The socio-economic importance of knowledge production in a knowledge economy is clear, and it is an emerging social phenomenon and research agenda in geographical studies. Knowledge production, and where, how and by whom it is produced, is an urban phenomenon that is poorly understood in an era of strong urbanisation. This paper focuses on knowledge community precincts as the catalytic magnet infrastructures impacting on knowledge production in cities. The paper discusses the increasing importance of knowledge-based urban development within the paradigm of the knowledge economy, and the role of knowledge community precincts as instruments to seed the foundation of knowledge production in cities. This paper explores the knowledge-based urban development potential, and particularly the knowledge community precinct development potential, of Sydney, Melbourne and Brisbane, and benchmarks this against that of Boston, Massachusetts.


Atmospheric ions are produced by many natural and anthropogenic sources and their concentrations vary widely between different environments. There is very little information on their concentrations in different types of urban environments, how they compare across these environments and what their dominant sources are. In this study, we measured airborne concentrations of small ions, particles and net particle charge at 32 different outdoor sites in and around a major city in Australia and identified the main ion sources. Sites were classified into seven groups: park, woodland, city centre, residential, freeway, power lines and power substation. Generally, parks were situated away from ion sources and represented the urban background value of about 270 ions cm⁻³. Median concentrations in all other groups were significantly higher than in the parks. We show that motor vehicles and power transmission systems are the two major ion sources in urban areas. Power lines and substations constituted strong unipolar sources, while motor vehicle exhaust constituted strong bipolar sources. The small ion concentration in urban residential areas was about 960 cm⁻³. At sites where ion sources were co-located with particle sources, ion concentrations were suppressed by the ion-particle attachment process. These results improve our understanding of air ion distribution and its interaction with particles in the urban outdoor environment.


Reactive oxygen species (ROS) and related free radicals are considered to be key factors underpinning the various adverse health effects associated with exposure to ambient particulate matter. Therefore, measurement of ROS is a crucial factor for assessing the potential toxicity of particles. In this work, a novel profluorescent nitroxide, BPEAnit, was investigated as a probe for detecting particle-derived ROS. BPEAnit has a very low fluorescence emission due to inherent quenching by the nitroxide group, but upon radical trapping or redox activity, a strong fluorescence is observed. BPEAnit was tested for detection of ROS present in mainstream and sidestream cigarette smoke. In the case of mainstream cigarette smoke, there was a linear increase in fluorescence intensity with an increasing number of cigarette puffs, equivalent to an average of 101 nmol ROS per cigarette based on the number of moles of the probe reacted. Sidestream cigarette smoke sampled from an environmental chamber exposed BPEAnit to much lower concentrations of particles, but still resulted in a clearly detectable increase in fluorescence intensity with sampling time. It was calculated that the amount of ROS was equivalent to 50 ± 2 nmol per mg of particulate matter; however, this value decreased with ageing of the particles in the chamber. Overall, BPEAnit was shown to provide a sensitive response related to the oxidative capacity of the particulate matter. These findings present a good basis for employing the new BPEAnit probe for the investigation of particle-related ROS generated from cigarette smoke as well as from other combustion sources.


This study reports the potential toxicological impact of particles produced during biomass combustion by an automatic pellet boiler and a traditional logwood stove under various combustion conditions, using a novel profluorescent nitroxide probe, BPEAnit. This probe is weakly fluorescent, but yields strong fluorescence emission upon radical trapping or redox activity. Samples were collected by bubbling aerosol through an impinger containing BPEAnit solution, followed by fluorescence measurement. The fluorescence of BPEAnit was measured for particles produced during various combustion phases: at the beginning of burning (cold start), during stable combustion after refilling with fuel (warm start) and under poor burning conditions. For particles produced by the logwood stove under cold-start conditions, significantly higher amounts of reactive species per unit of particulate mass were observed compared to emissions produced during a warm start. In addition, sampling of logwood burning emissions after passing through a thermodenuder at 250 °C resulted in an 80-100% reduction of the fluorescence signal of the BPEAnit probe, indicating that the majority of reactive species were semivolatile. Moreover, the amount of reactive species showed a strong correlation with the amount of particulate organic material. This indicates the importance of semivolatile organics in particle-related toxicity. Particle emissions from the pellet boiler, although of similar mass concentration, were not observed to lead to an increase in fluorescence signal during any of the combustion phases.