247 results for zero divisione


Relevance:

10.00%

Publisher:

Abstract:

Purpose - Thermo-magnetic convection and heat transfer of a paramagnetic fluid placed in a micro-gravity condition (g = 0) and under a uniform vertical gradient magnetic field in an open square cavity with three cold sidewalls have been studied numerically. Design/methodology/approach - The magnetic force is proportional to the magnetic susceptibility and the gradient of the square of the magnetic induction. The magnetic susceptibility is inversely proportional to the absolute temperature, following Curie's law. Thermal convection of a paramagnetic fluid can therefore take place even in a zero-gravity environment, as a direct consequence of temperature differences arising within the fluid from constant internal heat generation in the presence of a magnetic field gradient. Findings - The effects of the magnetic Rayleigh number, Ra, Prandtl number, Pr, and paramagnetic fluid parameter, m, on the flow pattern and isotherms, as well as on the heat absorption, are presented graphically. It is found that the heat transfer rate is suppressed as the magnetic Rayleigh number and the paramagnetic fluid parameter increase. Originality/value - It is possible to control the buoyancy force by using a superconducting magnet. To the best of the author's knowledge, no literature related to magnetic convection for this configuration is available.
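The Curie's-law dependence described above can be sketched in a few lines: the magnetic body force on the fluid scales with the susceptibility (inversely proportional to absolute temperature) and the gradient of the square of the magnetic induction. A minimal illustration, with placeholder constants rather than values from the paper:

```python
import math

# Minimal sketch of the temperature dependence behind thermo-magnetic
# convection. The Curie constant C and the gradient of B^2 are illustrative
# placeholders, not values from the paper.

MU0 = 4e-7 * math.pi  # permeability of free space (T m/A)

def susceptibility(T, C=1.0):
    """Curie's law: chi is inversely proportional to absolute temperature T (K)."""
    return C / T

def magnetic_force_density(T, grad_B2, C=1.0):
    """Magnetic body force per unit volume ~ (chi / (2 mu0)) * grad(B^2)."""
    return susceptibility(T, C) / (2.0 * MU0) * grad_B2

# A hotter parcel has lower susceptibility, so it feels a weaker magnetic
# force than a colder parcel in the same field gradient -- the imbalance
# that drives convection even at g = 0.
f_cold = magnetic_force_density(T=290.0, grad_B2=1.0)
f_hot = magnetic_force_density(T=310.0, grad_B2=1.0)
```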


Zero energy buildings (ZEB) and zero energy homes (ZEH) are a current hot topic globally for policy makers (what are the benefits and costs?), designers (how do we design them?), the construction industry (can we build them?), marketing (will consumers buy them?) and researchers (do they work, and what are the implications?). This paper presents initial findings from measured data for a 9-star (as built), off-ground detached family home constructed in south-east Queensland in 2008. The integrated systems approach to the design of the house is analysed against each of its three main goals: maximising the thermal performance of the building envelope, minimising energy demand whilst maintaining energy service levels, and implementing a multi-pronged low carbon approach to energy supply. The performance outcomes of each of these stages are evaluated against definitions of Net Zero Carbon / Net Zero Emissions (Site and Source) and Net Zero Energy (onsite generation vs primary energy imports). The paper concludes with a summary of the multiple benefits of combining very high efficiency building envelopes with diverse energy management strategies: a robustness, resilience, affordability and autonomy not generally seen in housing.


A zero-energy home (ZEH) is a residential dwelling that generates as much energy annually from onsite renewable sources as it consumes in its operation. A positive energy home (PEH) generates more energy than it consumes. The key design and construction elements, and the costs and benefits, of such buildings are the subject of increasing research globally. Approaching this topic from the perspective of the role of such homes in the planning and development ‘supply chain’, this paper presents the measured outcomes of a PEH and discusses the urban design implications. First, using twelve months of detailed performance data from an occupied sub-tropical home, the paper analyses the design approach and performance outcomes that enable it to be classified as ‘positive energy’. Second, it analyses both the urban design strategies that assisted the house in achieving its positive energy status and the impacts of such housing on urban design and infrastructure. Third, the triple bottom line implications are discussed from the viewpoint of both the individual household and the broader community. The paper concludes with recommendations for the research required to further underpin and quantify the role of ZEHs and PEHs in enabling and supporting the economic, social and ecological sustainability of urban developments.
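The ZEH/PEH definitions above reduce to a simple annual energy balance. A toy sketch (the kWh figures are invented for illustration):

```python
# Toy classification of a dwelling from a year of metered data, following the
# definitions above: a ZEH generates at least as much energy annually as it
# consumes; a PEH generates more. The kWh figures are invented.

def classify_home(annual_generation_kwh, annual_consumption_kwh):
    balance = annual_generation_kwh - annual_consumption_kwh
    if balance > 0:
        return "PEH"           # positive energy home
    if balance == 0:
        return "ZEH"           # zero-energy home (exact annual balance)
    return "net importer"      # neither zero nor positive energy

status = classify_home(annual_generation_kwh=8200, annual_consumption_kwh=6500)
```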


Abstract Purpose: To determine how high and low contrast visual acuities are affected by blur caused by crossed-cylinder lenses. Method: Crossed-cylinder lenses of power zero (no added lens), +0.12 DS/-0.25 DC, +0.25 DS/-0.50 DC and +0.37 DS/-0.75 DC were placed over the correcting lenses of the right eyes of eight subjects. Negative cylinder axes used were 15-180 degrees in 15 degree steps for the two higher crossed-cylinders and 30-180 degrees in 30 degree steps for the lowest crossed-cylinder. Targets were single lines of letters based on the Bailey-Lovie chart. Successively smaller lines were read until the subject could not read any of the letters correctly. Two contrasts were used: high (100%) and low (10%). The screen luminance of 100 cd/m2, together with the room lighting, gave pupil sizes of 4.5 to 6 mm. Results: High contrast visual acuities were better than low contrast visual acuities by 0.1 to 0.2 log units (1 to 2 chart lines) for the no added lens condition. Based on comparing the average of visual acuities for the 0.75 D crossed-cylinder with the best visual acuity for a given contrast and subject, the rates of change of visual acuity per unit blur strength were similar for high contrast (0.34 ± 0.05 logMAR/D) and low contrast (0.37 ± 0.09 logMAR/D). There were considerable asymmetry effects, with the average loss in visual acuity across the two contrasts and the 0.50 D/0.75 D crossed-cylinders doubling between the 165° and 60° negative cylinder axes. The loss of visual acuity with 0.75 D crossed-cylinders was approximately twice that occurring for defocus of the same blur strength. Conclusion: Small levels of crossed-cylinder blur (≤ 0.75 D) produce losses in visual acuity that are dependent on the cylinder axis. 0.75 D crossed-cylinders produce losses in visual acuity that are twice those produced by defocus of the same blur strength.
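If acuity loss is taken as linear in blur strength at the reported rates (an assumption made here for illustration; the paper reports rates, not a fitted linear model), the expected loss for a 0.75 D crossed-cylinder works out as follows, with one Bailey-Lovie chart line being 0.1 logMAR:

```python
# Illustrative arithmetic using the reported rates of acuity loss per unit
# blur strength (0.34 logMAR/D high contrast, 0.37 logMAR/D low contrast).
# Treating the loss as linear in blur strength is an assumption made here.

LINE_STEP_LOGMAR = 0.1  # one chart line on a Bailey-Lovie chart

def acuity_loss_logmar(blur_strength_d, rate_logmar_per_d):
    return rate_logmar_per_d * blur_strength_d

def chart_lines_lost(loss_logmar):
    return loss_logmar / LINE_STEP_LOGMAR

high_loss = acuity_loss_logmar(0.75, 0.34)  # ~0.26 logMAR
low_loss = acuity_loss_logmar(0.75, 0.37)   # ~0.28 logMAR
```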


Radial Hele-Shaw flows are treated analytically using conformal mapping techniques. The geometry of interest has a doubly-connected annular region of viscous fluid surrounding an inviscid bubble that is either expanding or contracting due to a pressure difference caused by injection or suction of the inviscid fluid. The zero-surface-tension problem is ill-posed for both bubble expansion and contraction, as both scenarios involve viscous fluid displacing inviscid fluid. Exact solutions are derived by tracking the location of singularities and critical points in the analytic continuation of the mapping function. We show that by treating the critical points, it is easy to observe finite-time blow-up, and the evolution equations may be written in exact form using complex residues. We present solutions that start with cusps on one interface and end with cusps on the other, as well as solutions that have the bubble contracting to a point. For the latter solutions, the bubble approaches an ellipse in shape at extinction.


The purpose of this study was to determine the effects of cryotherapy, in the form of cold water immersion, on knee joint position sense. Fourteen healthy volunteers, with no previous knee injury or pre-existing clinical condition, participated in this randomized cross-over trial. The intervention consisted of a 30-min immersion, to the level of the umbilicus, in either cold (14 ± 1°C) or tepid water (28 ± 1°C). Approximately one week later, in a randomized fashion, the volunteers completed the remaining immersion. Active ipsilateral limb repositioning sense of the right knee was measured, using weight-bearing and non-weight-bearing assessments, employing video-recorded 3D motion analysis. These assessments were conducted immediately before and after the cold and tepid water immersions. No significant differences were found between treatments for the absolute (P = 0.29), relative (P = 0.21) or variable error (P = 0.86). The average effect size of the outcome measures was modest (range -0.49 to 0.9) and all the associated 95% confidence intervals for these effect sizes crossed zero. These results indicate that there is no evidence of an enhanced risk of injury, following a return to sporting activity, after cold water immersion.


The effects of tumour motion during radiation therapy delivery have been widely investigated. Motion effects have become increasingly important with the introduction of dynamic radiotherapy delivery modalities such as enhanced dynamic wedges (EDWs) and intensity modulated radiation therapy (IMRT), where a dynamically collimated radiation beam is delivered to a moving target, resulting in dose blurring and interplay effects which are a consequence of the combined tumour and beam motion. Prior to this work, reported studies on EDW-based interplay effects had been restricted to experimental methods for assessing single-field, non-fractionated treatments. In this work, the interplay effects have been investigated for EDW treatments. Single and multiple field treatments have been studied using experimental and Monte Carlo (MC) methods. Initially, this work experimentally studies interplay effects for single-field, non-fractionated EDW treatments, using radiation dosimetry systems placed on a sinusoidally moving platform. A number of wedge angles (60°, 45° and 15°), field sizes (20 × 20, 10 × 10 and 5 × 5 cm2), amplitudes (10-40 mm in steps of 10 mm) and periods (2 s, 3 s, 4.5 s and 6 s) of tumour motion are analysed (using gamma analysis) for parallel and perpendicular motions (where the tumour and jaw motions are either parallel or perpendicular to each other). For parallel motion it was found that both the amplitude and period of tumour motion affect the interplay; this becomes more prominent where the collimator and tumour speeds become identical. For perpendicular motion the amplitude of tumour motion is the dominant factor, whereas varying the period of tumour motion has no observable effect on the dose distribution. The wedge angle results suggest that the use of a large wedge angle generates greater dose variation for both parallel and perpendicular motions.
The use of a small field size with a large tumour motion results in the loss of the wedged dose distribution for both parallel and perpendicular motion. From these single field measurements, a motion amplitude and period were identified which show the poorest agreement between the target motion and dynamic delivery, and these are used as the 'worst case' motion parameters. The experimental work is then extended to multiple-field fractionated treatments. Here a number of pre-existing, multiple-field, wedged lung plans are delivered to the radiation dosimetry systems, employing the worst case motion parameters. Moreover, a four-field EDW lung plan (using a 4D CT data set) is delivered to the IMRT quality control phantom with a dummy tumour insert over four fractions, using the worst case parameters, i.e. 40 mm amplitude and 6 s period. The analysis of the film doses using gamma analysis at 3%/3 mm indicates the non-averaging of the interplay effects for this particular study, with a gamma pass rate of 49%. To enable Monte Carlo modelling of the problem, the DYNJAWS component module (CM) of the BEAMnrc user code is validated and automated. DYNJAWS has recently been introduced to model the dynamic wedges, and is commissioned here for 6 MV and 10 MV photon energies. It is shown that this CM can accurately model the EDWs for a number of wedge angles and field sizes. The dynamic and step-and-shoot modes of the CM are compared for their accuracy in modelling the EDW, and the dynamic mode is shown to be more accurate. An automation of the DYNJAWS-specific input file, which specifies the probability of selection of a subfield and the respective jaw coordinates, has been carried out. This automation simplifies the generation of the BEAMnrc input files for DYNJAWS. The commissioned DYNJAWS model is then used to study multiple field EDW treatments using MC methods.
The 4D CT data of an IMRT phantom with the dummy tumour is used to produce a set of Monte Carlo simulation phantoms, onto which the delivery of single field and multiple field EDW treatments is simulated. A number of static and motion multiple field EDW plans have been simulated. The comparison of dose volume histograms (DVHs) and gamma volume histograms (GVHs) for four-field EDW treatments (where the collimator and patient motion are in the same direction) using small (15°) and large (60°) wedge angles indicates a greater mismatch between the static and motion cases for the large wedge angle. Finally, to use gel dosimetry as a validation tool, a new technique called the 'zero-scan' method is developed for reading the gel dosimeters with x-ray computed tomography (CT). It has been shown that multiple scans of a gel dosimeter (in this case 360 scans) can be used to reconstruct a zero-scan image. This zero-scan image has a similar precision to an image obtained by averaging the CT images, without the additional dose delivered by the CT scans. In this investigation the interplay effects have been studied for single and multiple field fractionated EDW treatments using experimental and Monte Carlo methods. For the Monte Carlo methods, the DYNJAWS component module of the BEAMnrc code has been validated and automated, and further used to study the interplay for multiple field EDW treatments. The zero-scan method, a new gel dosimetry readout technique, has been developed for reading the gel images using x-ray CT without losing precision or accuracy.
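The 3%/3 mm gamma analysis used throughout for comparing measured and planned doses can be sketched in one dimension. This is a toy, brute-force version (real tools operate on 2D/3D dose grids with interpolation), and the dose profile below is invented:

```python
import numpy as np

# Brute-force 1-D gamma index at the 3%/3 mm criterion: a measured point
# passes (gamma <= 1) if some reference point lies within the combined
# dose-difference / distance-to-agreement ellipse. Global normalisation to
# the reference maximum is used. Toy data, for illustration only.

def gamma_1d(positions_mm, measured, reference, dose_tol=0.03, dist_tol_mm=3.0):
    ref_max = np.max(reference)
    gammas = np.empty(len(measured), dtype=float)
    for i, (x, d) in enumerate(zip(positions_mm, measured)):
        dist_term = (positions_mm - x) / dist_tol_mm
        dose_term = (reference - d) / (dose_tol * ref_max)
        gammas[i] = np.min(np.sqrt(dist_term**2 + dose_term**2))
    return gammas

def pass_rate(gammas):
    return float(np.mean(gammas <= 1.0))

x = np.linspace(0.0, 50.0, 51)                  # positions (mm)
reference = np.exp(-((x - 25.0) / 10.0) ** 2)   # toy planned dose profile
measured = reference * 1.01                     # uniform 1% overdose
rate = pass_rate(gamma_1d(x, measured, reference))  # 1% < 3%, so all pass
```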


The Poisson distribution has often been used for count data such as accident counts. The Negative Binomial (NB) distribution has been adopted for count data to take care of the over-dispersion problem. However, Poisson and NB distributions are incapable of accounting for some unobserved heterogeneities due to spatial and temporal effects in accident data. To overcome this problem, Random Effect models have been developed. Another challenge with existing traffic accident prediction models is the presence of excess zero accident observations in some accident data. Although the Zero-Inflated Poisson (ZIP) model is capable of handling the dual-state system in accident data with excess zero observations, it does not accommodate the within-location and between-location correlation heterogeneities, which are the basic motivation for Random Effect models. This paper proposes an effective way of fitting a ZIP model with location-specific random effects; Bayesian analysis is recommended for model calibration and assessment.
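The dual-state structure of the ZIP model can be written down directly: with probability pi an observation comes from an "always zero" state, otherwise from a Poisson(lam) state. A minimal sketch, without the location-specific random effects the paper adds:

```python
import math

# Zero-Inflated Poisson probability mass function: a two-state mixture of a
# degenerate zero state (probability pi) and a Poisson(lam) state. The
# location-specific random effects and Bayesian calibration proposed in the
# paper are beyond this toy sketch.

def zip_pmf(k, lam, pi):
    poisson = math.exp(-lam) * lam**k / math.factorial(k)
    if k == 0:
        return pi + (1.0 - pi) * poisson   # both states can produce a zero
    return (1.0 - pi) * poisson

# Zero inflation: the ZIP zero probability exceeds the plain Poisson one.
p0_zip = zip_pmf(0, lam=2.0, pi=0.3)
p0_poisson = math.exp(-2.0)
```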


Soil organic carbon sequestration rates over 20 years, based on the Intergovernmental Panel on Climate Change (IPCC) methodology, were combined with local economic data to determine the potential for soil C sequestration in wheat-based production systems on the Indo-Gangetic Plain (IGP). The C sequestration potential of rice–wheat systems of India on conversion to no-tillage is estimated to be 44.1 Mt C over 20 years. Implementing no-tillage practices in maize–wheat and cotton–wheat production systems would yield an additional 6.6 Mt C. This offset is equivalent to 9.6% of India's annual greenhouse gas emissions (519 Mt C) from all sectors (excluding land use change and forestry), or less than one percent per annum. The economic analysis was summarized as carbon supply curves expressing the total additional C accumulated over 20 years for a price per tonne of carbon sequestered ranging from zero to USD 200. At a carbon price of USD 25 per Mg C, 3 Mt C (7% of the soil C sequestration potential) could be sequestered over 20 years through the implementation of no-till cropping practices in rice–wheat systems of the Indian States of the IGP, increasing to 7.3 Mt C (17% of the soil C sequestration potential) at USD 50 per Mg C. Maximum levels of sequestration could be attained with carbon prices approaching USD 200 per Mg C for the States of Bihar and Punjab. At this carbon price, a total of 34.7 Mt C (79% of the estimated C sequestration potential) could be sequestered over 20 years across the rice–wheat region of India, with Uttar Pradesh contributing 13.9 Mt C.
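The quoted shares of the 44.1 Mt C potential can be checked with simple arithmetic:

```python
# Back-of-envelope check of the supply-curve shares quoted above, relative to
# the 44.1 Mt C sequestration potential estimated for rice-wheat systems.

POTENTIAL_MT_C = 44.1

def share_of_potential(sequestered_mt_c):
    """Sequestered amount as a percentage of the 20-year potential."""
    return 100.0 * sequestered_mt_c / POTENTIAL_MT_C

share_usd25 = share_of_potential(3.0)    # ~7% at USD 25 per Mg C
share_usd50 = share_of_potential(7.3)    # ~17% at USD 50 per Mg C
share_usd200 = share_of_potential(34.7)  # ~79% at USD 200 per Mg C
```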


In this groundbreaking book, acclaimed sociologist and Pulitzer Prize finalist Elliott Currie draws on years of interviews to offer a profound investigation of what has gone wrong for so many “mainstream” American adolescents. Rejecting such predictable answers as TV violence, permissiveness, and inherent evil, Currie links this crisis to a pervasive “culture of exclusion” fostered by a society in which medications trump guidance and a punitive “zero tolerance” approach to adolescent misbehavior has become the norm. Broadening his inquiry, he dissects the changes in middle-class life that stratify the world into "winners" and "losers," imposing an extraordinarily harsh culture—and not just on kids. Vivid, compelling, and deeply empathetic, The Road to Whatever is a stark indictment of a society that has lost the will—or the capacity—to care.


Secrecy of decryption keys is an important prerequisite for the security of any encryption scheme, and compromised private keys must be immediately replaced. Forward Security (FS), introduced to Public Key Encryption (PKE) by Canetti, Halevi, and Katz (Eurocrypt 2003), reduces the damage from compromised keys by guaranteeing confidentiality of messages that were encrypted prior to the compromise event. The FS property was also shown to be achievable in (Hierarchical) Identity-Based Encryption (HIBE) by Yao, Fazio, Dodis, and Lysyanskaya (ACM CCS 2004). Yet, for emerging encryption techniques offering flexible access control to encrypted data by means of functional relationships between ciphertexts and decryption keys, FS protection was not known to exist. In this paper we introduce FS to the powerful setting of Hierarchical Predicate Encryption (HPE), proposed by Okamoto and Takashima (Asiacrypt 2009). Anticipated applications of FS-HPE schemes can be found in searchable encryption and in fully private communication. Considering the dependencies amongst the concepts, our FS-HPE scheme implies forward-secure flavors of Predicate Encryption and (Hierarchical) Attribute-Based Encryption. Our FS-HPE scheme guarantees forward security for plaintexts and for attributes that are hidden in HPE ciphertexts. It further allows delegation of decrypting abilities at any point in time, independent of FS time evolution. It realizes zero-inner-product predicates and is proven adaptively secure under standard assumptions. As the "cross-product" approach taken in FS-HIBE is not directly applicable to the HPE setting, our construction resorts to techniques that are specific to existing HPE schemes and extends them with what can be seen as reminiscent of binary tree encryption from FS-PKE.
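A zero-inner-product predicate authorises decryption exactly when the ciphertext's attribute vector x and the key's predicate vector v satisfy ⟨x, v⟩ = 0. The toy check below illustrates only the predicate logic, not the cryptography (the actual scheme hides the attributes and works over pairing-friendly groups); the equality encoding shown is the standard inner-product trick from the predicate encryption literature:

```python
# Toy zero-inner-product predicate: decryption is authorised iff <x, v> = 0.
# This illustrates the predicate logic only, not the cryptographic scheme.

def inner_product_is_zero(x, v):
    return sum(xi * vi for xi, vi in zip(x, v)) == 0

# Standard encoding of an equality policy "attr == target" as an inner
# product: x = (attr, 1), v = (1, -target), so <x, v> = attr - target.
def equality_policy_holds(attr, target):
    return inner_product_is_zero((attr, 1), (1, -target))

ok = equality_policy_holds(5, 5)       # matching attribute -> authorised
denied = equality_policy_holds(6, 5)   # mismatch -> not authorised
```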


Over the last twenty years, the use of open content licences has become increasingly and surprisingly popular. The use of such licences challenges the traditional incentive-based model of exclusive rights under copyright. Instead of providing a means to charge for the use of particular works, what seems important is mitigating potential personal harm to the author and, in some cases, preventing non-consensual commercial exploitation. It is interesting in this context to observe the primacy of what are essentially moral rights over the exclusionary economic rights. The core elements of common open content licences map somewhat closely to continental conceptions of the moral rights of authorship. Most obviously, almost all free software and free culture licences require attribution of authorship. More interestingly, there is a tension between the social norms developed in free software communities and those that have emerged in the creative arts over integrity and commercial exploitation. For programmers interested in free software, licence terms that prohibit commercial use or modification are almost completely inconsistent with the ideological and utilitarian values that underpin the movement. For those in the creative industries, on the other hand, non-commercial terms and, to a lesser extent, terms that prohibit all but verbatim distribution continue to play an extremely important role in the sharing of copyright material. While prohibitions on commercial use often serve an economic imperative, there is also a certain personal interest for many creators in avoiding harmful exploitation of their expression – an interest that has sometimes been recognised as forming a component of the moral right of integrity. One particular continental moral right – the right of withdrawal – is present neither in Australian law nor in any of the common open content licences.
Despite some marked differences, both free software and free culture participants are using contractual methods to articulate the norms of permissible sharing. Legal enforcement is rare and often prohibitively expensive, and the various communities accordingly rely upon shared understandings of acceptable behaviour. The licences that are commonly used represent a formalised expression of these community norms and provide the theoretically enforceable legal baseline that lends them legitimacy. The core terms of these licences are designed primarily to alleviate risk and minimise transaction costs in sharing and using copyright expression. Importantly, however, the range of available licences reflects different optional balances in the norms of creating and sharing material. Generally, it is possible to see that, stemming particularly from the US, open content licences are fundamentally important in providing a set of normatively accepted copyright balances that reflect the interests sought to be protected through moral rights regimes. As the cost of creation, distribution, storage, and processing of expression continues to fall towards zero, there are increasing incentives to adopt open content licences to facilitate wide distribution and reuse of creative expression. Thinking of these protocols not only as reducing transaction costs but also as setting normative principles of participation assists in conceptualising the role of open content licences and the continuing tensions that permeate modern copyright law.


This study proceeds from a central interest in the importance of systematically evaluating operational large-scale integrated information systems (IS) in organisations. The study is conducted within the IS-Impact Research Track at Queensland University of Technology (QUT). The goal of the IS-Impact Track is "to develop the most widely employed model for benchmarking information systems in organizations for the joint benefit of both research and practice" (Gable et al., 2009). The track espouses programmatic research having the principles of incrementalism, tenacity, holism and generalisability, through replication and extension research strategies. Track efforts have yielded the bicameral IS-Impact measurement model; the ‘impact’ half includes the Organisational-Impact and Individual-Impact dimensions, and the ‘quality’ half includes the System-Quality and Information-Quality dimensions. Akin to Gregor’s (2006) analytic theory, the IS-Impact model is conceptualised as a formative, multidimensional index and is defined as "a measure at a point in time, of the stream of net benefits from the IS, to date and anticipated, as perceived by all key-user-groups" (Gable et al., 2008, p. 381). The study adopts the IS-Impact model (Gable et al., 2008) as its core theory base. Prior work within the IS-Impact track has been consciously constrained to financial IS for their homogeneity. This study adopts a context-extension strategy (Berthon et al., 2002) with the aim of further validating and extending the IS-Impact measurement model in a new context, i.e. a different IS: Human Resources (HR). The overarching research question is: "How can the impacts of large-scale integrated HR applications be effectively and efficiently benchmarked?" This managerial question (Cooper & Emory, 1995) decomposes into two more specific research questions for the new HR context. (RQ1): "Is the IS-Impact model complete?"
(RQ2): "Is the IS-Impact model valid as a 1st-order formative, 2nd-order formative multidimensional construct?" The study adhered to the two-phase approach of Gable et al. (2008) to hypothesise and validate a measurement model. The initial ‘exploratory phase’ employed a zero-base qualitative approach to re-instantiating the IS-Impact model in the HR context. The subsequent ‘confirmatory phase’ sought to validate the resultant hypothesised measurement model against newly gathered quantitative data. The unit of analysis for the study is the application ‘ALESCO’, an integrated large-scale HR application implemented at Queensland University of Technology (QUT), a large Australian university (with approximately 40,000 students and 5,000 staff). Target respondents of both study phases were the ALESCO key-user-groups: strategic users, management users, operational users and technical users, who directly use ALESCO or its outputs. An open-ended, qualitative survey was employed in the exploratory phase, with the objective of exploring the completeness and applicability of the IS-Impact model’s dimensions and measures in the new context, and of conceptualising any resultant model changes to be operationalised in the confirmatory phase. Responses from 134 ALESCO users to the main survey question, "What do you consider have been the impacts of the ALESCO (HR) system in your division/department since its implementation?", were decomposed into 425 ‘impact citations’. Citation mapping using a deductive (top-down) content analysis approach instantiated all dimensions and measures of the IS-Impact model, evidencing its content validity in the new context. Seeking to probe additional (perhaps negative) impacts, the survey included the further open question "In your opinion, what can be done better to improve the ALESCO (HR) system?"
Responses to this question decomposed into a further 107 citations which in the main did not map to IS-Impact, but rather coalesced around the concept of IS-Support. Deductively drawing from relevant literature, and working inductively from the unmapped citations, the new ‘IS-Support’ construct, including the four formative dimensions of (i) training, (ii) documentation, (iii) assistance, and (iv) authorisation (each having reflective measures), was defined as: "a measure at a point in time, of the support the [HR] information system key-user groups receive to increase their capabilities in utilising the system." Thus, a further goal of the study became validation of the IS-Support construct, suggesting the research question (RQ3): "Is IS-Support valid as a 1st-order reflective, 2nd-order formative multidimensional construct?" With the aim of validating IS-Impact within its nomological net (identification through structural relations), as in prior work, Satisfaction was hypothesised as its immediate consequence. The IS-Support construct, having derived from a question intended to probe IS-Impacts, was also hypothesised as an antecedent of Satisfaction, thereby suggesting the research question (RQ4): "What is the relative contribution of IS-Impact and IS-Support to Satisfaction?" With the goal of testing the above research questions, IS-Impact, IS-Support and Satisfaction were operationalised in a quantitative survey instrument. Partial least squares (PLS) structural equation modelling employing 221 valid responses largely evidenced the validity of the commencing IS-Impact model in the HR context. IS-Support too was validated as operationalised (including 11 reflective measures of its 4 formative dimensions). IS-Support alone explained 36% of Satisfaction and IS-Impact alone 70%; in combination they explained 71%, with virtually all influence of IS-Support subsumed by IS-Impact.
Key study contributions to research include: (1) validation of IS-Impact in the HR context, (2) validation of the newly conceptualised IS-Support construct as an important antecedent of Satisfaction, and (3) demonstration of the redundancy of IS-Support when gauging IS-Impact. The study also makes valuable contributions to practice, the research track and the sponsoring organisation.


The health effects of environmental hazards are often examined using time series of the association between a daily response variable (e.g., deaths) and a daily level of exposure (e.g., temperature). Exposures are usually averaged over a network of stations. This gives each station equal importance, and negates the opportunity for some stations to be better measures of exposure. We used a Bayesian hierarchical model that weighted stations using random variables between zero and one. We compared the weighted estimates to the standard model using data on health outcomes (deaths and hospital admissions) and exposures (air pollution and temperature) in Brisbane, Australia. The improvements in model fit were relatively small, and the estimated health effects of pollution were similar using either the standard or the weighted estimates. Spatially weighted exposures would probably be more worthwhile when there is either greater spatial detail in the health outcome or greater spatial variation in exposure.
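The weighting idea reduces to replacing the equal-weight network average with a convex combination of station readings. A sketch with fixed weights (in the paper the weights are random variables estimated within the Bayesian hierarchical model; the readings here are invented):

```python
import numpy as np

# Weighted station exposure: each monitoring station gets a weight between
# zero and one instead of equal importance. Fixed weights are used here for
# illustration; the paper estimates them as random variables in a Bayesian
# hierarchical model.

def weighted_exposure(station_readings, weights):
    w = np.asarray(weights, dtype=float)
    w = w / w.sum()  # normalise so the weights sum to one
    return float(np.asarray(station_readings, dtype=float) @ w)

readings = [21.0, 25.0, 30.0]  # e.g. one day's temperature at three stations

standard = weighted_exposure(readings, [1.0, 1.0, 1.0])  # equal importance
weighted = weighted_exposure(readings, [0.7, 0.2, 0.1])  # station 1 dominates
```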


The mining environment, being complex, irregular and time-varying, presents a challenging prospect for stereo vision. The objective is to produce a stereo vision sensor suited to close-range scenes consisting primarily of rocks. This sensor should be able to produce a dense depth map within real-time constraints. Speed and robustness are of foremost importance for this investigation. A number of area-based matching metrics have been implemented, including the SAD, SSD, NCC, and their zero-meaned versions. The NCC and the zero-meaned SAD and SSD were found to produce the disparity maps with the highest proportion of valid matches. The plain SAD and SSD were the least computationally expensive, since all their operations take place in integer arithmetic; however, they were extremely sensitive to radiometric distortion. Non-parametric matching techniques, in particular the rank and census transforms, have also been investigated. The rank and census transforms were found to be robust with respect to radiometric distortion, as well as able to produce disparity maps with a high proportion of valid matches. An additional advantage of both the rank and the census transform is their amenability to fast hardware implementation.
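The metrics compared above are straightforward to sketch in NumPy; subtracting each window's mean (the zero-meaned variants) is what confers robustness to a uniform radiometric offset between the two cameras. Toy windows, for illustration:

```python
import numpy as np

# Area-based stereo matching metrics: SAD, SSD, NCC and the zero-meaned
# variants ZSAD/ZSSD. Lower SAD/SSD means a better match; NCC close to 1
# means a better match. Toy 3x3 windows, for illustration only.

def sad(a, b):  return float(np.abs(a - b).sum())
def ssd(a, b):  return float(((a - b) ** 2).sum())
def zsad(a, b): return sad(a - a.mean(), b - b.mean())
def zssd(a, b): return ssd(a - a.mean(), b - b.mean())

def ncc(a, b):
    az, bz = a - a.mean(), b - b.mean()
    return float((az * bz).sum() / np.sqrt((az ** 2).sum() * (bz ** 2).sum()))

left = np.arange(9, dtype=float).reshape(3, 3) * 10.0  # toy left window
right = left + 15.0   # same patch under a uniform brightness (offset) change

# The plain metrics are fooled by the offset; the zero-meaned ones are not.
plain = sad(left, right)      # large, despite a perfect structural match
robust = zsad(left, right)    # zero: offset removed by mean subtraction
```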