962 results for Reach.
Abstract:
The automotive industry has been the focus of digital human modeling (DHM) research and application for many years. In the highly competitive marketplace for personal transportation, the desire to improve the customer’s experience has driven extensive research into both the physical and cognitive interaction between the vehicle and its occupants. Human models provide vehicle designers with tools to view and analyze product interactions before the first prototypes are built, potentially improving the design while reducing cost and development time. DHM research and applications initially focused on the prediction and representation of static postures for driver workstation layout, including assessments of seat adjustment ranges and exterior vision. Now DHMs are used for seat design and for assessment of driver reach and ingress/egress. DHMs and related simulation tools are expanding into the cognitive domain, with computational models of perception and motion, and into the dynamic domain, with models of physical responses to ride and vibration. Moreover, DHMs are now widely used to analyze the ergonomics of vehicle assembly tasks; in this case, the analysis aims to determine whether workers can be expected to complete the tasks safely and with good quality. This preface reviews the relevant literature to provide context for the nine new papers presented in this special issue.
Abstract:
Recent analyses of population data reveal that obesity rates continue to rise and are projected to reach unprecedented levels over the next decade [1]. Despite concerted efforts to impede the progression of obesity, weight loss and weight maintenance strategies remain, at best, partially successful endeavours. Although weight loss strategies can produce significant weight loss [2] and substantial improvements in the determinants of the metabolic risk profile [3, 4], it is clear that actual weight loss tends to be lower than anticipated, and most individuals who achieve weight loss will likely regain some weight [5] and may even overshoot [6] their pre-intervention body weight. As such, an improved understanding of the factors that contribute to lower-than-expected weight loss and poor weight maintenance would improve the effectiveness of weight loss interventions.
Abstract:
Microenterprise development programs (MEPs) have been recognised as a valuable way to help the poor engage in micro-businesses (Green et al., 2006; Vargas, 2000), presenting a way out of poverty (Choudhury et al., 2008; Strier, 2010). Concerns have been raised, however, that the benefits of MEPs often do not reach the extremely poor (Jones et al., 2004; Midgley, 2008; Mosley and Hulme, 1998; Nawaz, 2010; Pritchett, 2006). Balancing the reach of these programs with their depth is a challenging task. Targeting as many poor people as possible often results in MEPs focusing on the upper or middle poor, overlooking the most challenging group. As such, MEPs have been criticised for mission drift: losing sight of the organisation’s core purpose by assisting those more likely to succeed.
Abstract:
A fundamental prerequisite of population health research is the ability to establish an accurate denominator. This in turn requires that every individual in the study population is counted. However, this seemingly simple principle has become a point of conflict between researchers whose aim is to produce evidence of disparities in population health outcomes and governments whose policies promote (intentionally or not) the inequalities that are the underlying causes of health disparities. Research into the health of asylum seekers is a case in point. There is a growing body of evidence documenting the adverse effects of recent changes in asylum-seeking legislation, including mandatory detention. However, much of this evidence has been dismissed by some governments as unsound, biased and unscientific because, it is argued, it is derived from small samples or from case studies. Yet it is the policies of governments that are the key barrier to the conduct of rigorous population health research on asylum seekers. In this paper, the authors discuss the challenges of counting asylum seekers and the limitations of data reported in some industrialized countries. They argue that the lack of accurate statistical data on asylum seekers has been an effective neo-conservative strategy for erasing health inequalities in this vulnerable population, indeed a strategy that renders this population invisible. They describe some alternative strategies that researchers may use to obtain denominator data on hard-to-reach populations such as asylum seekers.
Abstract:
The information privacy requirements of patients and the information requirements of healthcare providers (HCPs) are competing concerns. Reaching a balance between these requirements has proven difficult but is crucial for the success of eHealth systems. Traditional approaches to information management have been preventive measures that either allow or deny access to information. We believe that this approach is inappropriate for a domain such as healthcare. We contend that introducing information accountability (IA) to eHealth systems can achieve the aforementioned balance without the need for rigid information control. IA is a fairly new concept in computer science; hence, there are as yet no unambiguously accepted principles, but the concept delivers promising advantages for information management in a robust manner. Accountable-eHealth (AeH) systems are eHealth systems that use IA principles as the measure for privacy and information management. AeH systems face three main impediments: technological; social; and ethical and legal. In this paper, we present the AeH model and focus on the legal aspects of AeH systems in Australia. We investigate current legislation available in Australia regarding health information management and identify future legal requirements if AeH systems are to be implemented in Australia.
Abstract:
An ambitious rendering of the digital future from a pioneer of media and cultural studies; a wise and witty take on a changing field and our orientation to it.
Investigates the uses of multimedia by creative and productive citizen-consumers to provide new theories of communication that accommodate social media, participatory action, and user-creativity.
Leads the way for new interdisciplinary engagement with systems thinking, complexity and evolutionary sciences, and the convergence of cultural and economic values.
Analyzes the historical uses of multimedia from print, through broadcasting, to the internet.
Combines conceptual innovation with historical erudition to present a high-level synthesis of ideas and detailed analysis of emergent forms and practices.
Features an international focus and global reach to provide a basis for students and researchers seeking broader perspectives.
Abstract:
It is widely recognised that exposure to air pollutants causes pulmonary and lung dysfunction as well as a range of neurological and vascular disorders. The rapid increase in worldwide carbon emissions continues to compromise environmental sustainability whilst contributing to premature death. Moreover, the harms caused by air pollution have a more pernicious reach, such as being a major source of climate change and ‘natural disasters’, which reportedly kill millions of people each year (World Health Organization, 2012). The opening quotations tell a story of the UK government's complacency towards the devastation of toxic and contaminating air emissions. The above headlines greeted the British public earlier this year after its government was taken to the Court of Appeal for an appalling air pollution record that continues to cause the premature deaths of 30,000 British people each year, at a health cost estimated at £20 billion per annum. This, combined with pending legal proceedings against the UK government for air pollution violations brought by the European Commission, points to a Cameron government that prioritises hot air and profit margins over human lives. The UK's legal air pollution regimes are an industry-dominated process that relies on negotiation and partnership between regulators and polluters. The entire model seeks to assist business compliance rather than punish corporate offenders. There is no language of ‘crime’ in relation to UK air pollution violations but rather a discourse of ‘exceedence’ (Walters, 2010). It is a regulatory system premised not on the ‘polluter pays’ principle but on the ‘polluter profit’ principle.
Abstract:
This paper examines the case of a procurement auction for a single project, in which the breakdown of the winning bid into its component items determines the value of payments subsequently made to the bidder as the work progresses. Unbalanced bidding, or bid skewing, involves the uneven distribution of mark-up among the component items in such a way as to derive increased benefit for the unbalancer without changing the total bid. One form of unbalanced bidding, termed Front Loading (FL), is thought to be widespread in practice. It involves overpricing the work items that occur early in the project and underpricing the work items that occur later in order to enhance the bidder's cash flow. Naturally, auctioneers attempt to protect themselves from the effects of unbalancing, typically by reserving the right to reject a bid that has been detected as unbalanced. As a result, models have been developed both to unbalance bids and to detect unbalanced bids, but virtually nothing is known of their use or success. This is of particular concern for the detection methods since, without testing, there is no way of knowing the extent to which unbalanced bids remain undetected or balanced bids are falsely detected as unbalanced. This paper reports on a simulation study aimed at demonstrating the likely effects of unbalanced bid detection models in a deterministic environment involving FL unbalancing in a Texas DOT detection setting, in which bids are deemed unbalanced if an item exceeds a maximum (or fails to reach a minimum) ‘cut-off’ value determined by the Texas method. A proportion of bids are automatically and maximally unbalanced over a long series of simulated contract projects, and the profits and detection rates of both the balancers and the unbalancers are compared. The results show that, as expected, balanced bids are often incorrectly detected as unbalanced, with the rate of (mis)detection increasing with the proportion of FL bidders in the auction. It is also shown that, while the profit for balanced bidders remains the same irrespective of the number of FL bidders involved, the FL bidder's profit increases with the proportion of FL bidders present in the auction. Sensitivity tests show the results to be generally robust: (mis)detection rates increase further when there are fewer bidders in the auction and when more data are averaged to determine the baseline value, but are smaller or larger with increased cut-off values and increased cost and estimate variability, depending on the number of FL bidders involved. The FL bidder's expected benefit from unbalancing, on the other hand, increases when there are fewer bidders in the auction. It also increases when the cut-off rate and discount rate are increased, when there is less variability in the costs and their estimates, and when less data are used in setting the baseline values.
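To make the detection rule concrete, the following is a minimal sketch of a cut-off-based unbalanced-bid check in the spirit of the Texas method described above; the per-item baseline (here, the mean of the competing bids) and the 25% cut-off band are illustrative assumptions, not the published Texas DOT parameters.

```python
# Sketch of cut-off detection: a bid is flagged as unbalanced if any item
# price falls outside a band around a per-item baseline. The baseline choice
# and cut-off fraction are assumptions for illustration only.
import numpy as np

def detect_unbalanced(bids: np.ndarray, cutoff: float = 0.25) -> np.ndarray:
    """bids: (n_bidders, n_items) matrix of item prices.
    Returns one boolean flag per bidder."""
    baseline = bids.mean(axis=0)                      # per-item baseline
    low, high = baseline * (1 - cutoff), baseline * (1 + cutoff)
    return ((bids < low) | (bids > high)).any(axis=1)

# Example: bidder 0 front-loads, overpricing early items and underpricing late ones.
rng = np.random.default_rng(0)
cost = np.array([100.0, 100.0, 100.0, 100.0])
bids = cost * (1 + 0.05 * rng.standard_normal((5, 4)))  # roughly balanced bids
bids[0] = cost * np.array([1.4, 1.3, 0.8, 0.7])         # FL-unbalanced bid
print(detect_unbalanced(bids))                          # bidder 0 is flagged
```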
Abstract:
In 1991, McNabb introduced the concept of mean action time (MAT) as a finite measure of the time required for a diffusive process to effectively reach steady state. Although this concept was initially adopted by others within the Australian and New Zealand applied mathematics community, it appears to have had little use outside this region until very recently, when in 2010 Berezhkovskii and coworkers rediscovered the concept of MAT in their study of morphogen gradient formation. All previous work in this area has been limited to studying single-species differential equations, such as the linear advection-diffusion-reaction equation. Here we generalise the concept of MAT by showing how the theory can be applied to coupled linear processes. We begin by studying coupled ordinary differential equations and extend our approach to coupled partial differential equations. Our new results have broad applications, including the analysis of models describing coupled chemical decay and cell differentiation processes, amongst others.
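For context, here is a minimal sketch of the standard single-species MAT definition that the abstract builds on (notation assumed here, consistent with McNabb's formulation). For a quantity $c(x,t)$ evolving from an initial state $c_0(x)$ to a steady state $c_\infty(x)$, define

$$F(x,t) = 1 - \frac{c(x,t) - c_\infty(x)}{c_0(x) - c_\infty(x)},$$

which increases from $0$ to $1$, so that $f(x,t) = \partial F/\partial t$ behaves as a probability density in $t$. The mean action time is the mean of this density:

$$T(x) = \int_0^\infty t\, f(x,t)\, \mathrm{d}t = \int_0^\infty \left[1 - F(x,t)\right] \mathrm{d}t,$$

where the second form follows by integration by parts, provided the transient decays sufficiently quickly.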
Abstract:
The purpose of this paper is to consider how libraries support the development of community networks, both physically and digitally. To do this, a case-study methodology was employed, combining data about the library with qualitative interviews with library users about their experience of the library. This paper proposes that libraries act as ‘third places’ that spatially connect people; libraries also build links with online media and play a critical role in inclusively connecting non-technology users with information on the Internet and with digital technology more generally. The paper establishes the value of libraries in the digital age and recommends that libraries actively seek ways to develop links between non-technology users and activity on the Internet, addressing the need to reach these types of non-technology users in different ways. Further, it suggests that libraries utilise their positioning as third places to create broader community networks, supporting local communities beyond existing users and beyond the library precinct.
Abstract:
The Cross-Entropy (CE) method is an efficient approach to the estimation of rare-event probabilities and to combinatorial optimization. This work presents a novel application of the CE method to the optimization of a soft-computing controller. A fuzzy controller was designed to command an unmanned aerial system (UAS) in a collision-avoidance task. The only sensor used to accomplish this task was a forward-facing camera. The CE method is used to reach a near-optimal controller by modifying the scaling factors of the controller inputs. The optimization was carried out using the ROS-Gazebo simulation system. To evaluate the optimization, a large number of tests were carried out with a real quadcopter.
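A minimal sketch of the CE optimization loop over the input scaling factors is given below; the cost function is a hypothetical stand-in for a ROS-Gazebo collision-avoidance rollout, and the population size, elite fraction and iteration count are assumptions.

```python
# Cross-entropy optimization sketch: sample scaling factors from a Gaussian,
# score each sample, refit the Gaussian to the elite samples, and repeat.
import numpy as np

def rollout_cost(scales: np.ndarray) -> float:
    # Hypothetical stand-in for one simulated flight: in the paper's setting
    # this would run the fuzzy controller in ROS-Gazebo and return a score.
    return float(np.sum((scales - np.array([1.5, 0.7, 2.0])) ** 2))

def cross_entropy(dim=3, iters=30, pop=50, elite_frac=0.2, seed=0):
    rng = np.random.default_rng(seed)
    mu, sigma = np.ones(dim), np.full(dim, 1.0)
    n_elite = max(1, int(pop * elite_frac))
    for _ in range(iters):
        samples = rng.normal(mu, sigma, size=(pop, dim))
        costs = np.array([rollout_cost(s) for s in samples])
        elite = samples[np.argsort(costs)[:n_elite]]     # lowest-cost samples
        mu, sigma = elite.mean(axis=0), elite.std(axis=0) + 1e-6
    return mu

print(cross_entropy())   # approaches the fictitious optimum [1.5, 0.7, 2.0]
```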
Abstract:
Passive air samplers (PAS) consisting of polyurethane foam (PUF) disks were deployed at 6 outdoor air monitoring stations in different land use categories (commercial, industrial, residential and semi-rural) to assess the spatial distribution of polybrominated diphenyl ethers (PBDEs) in the Brisbane airshed. Air monitoring sites covered an area of 1143 km2, and PAS were allowed to accumulate PBDEs in the city's airshed over three consecutive seasons commencing in the winter of 2008. The average sum of five (∑5) PBDE levels (BDEs 28, 47, 99, 100 and 209) was highest at the commercial and industrial sites (12.7 ± 5.2 ng PUF−1), which were relatively close to the city centre, and was a factor of 8 higher than at the residential and semi-rural sites located in outer Brisbane. To estimate the magnitude of the urban ‘plume’, an empirical exponential decay model was used to fit PAS data vs. distance from the CBD, with the best correlation observed when the particulate-bound BDE-209 was excluded (∑5-209) (r2 = 0.99) rather than included (∑5) (r2 = 0.84). At 95% confidence intervals the model predicts that, regardless of site characterization, ∑5-209 concentrations in a PAS sample taken 4–10 km from the city centre would be half those in a sample taken from the city centre, reaching a baseline or plateau (0.6 to 1.3 ng PUF−1) approximately 30 km from the CBD. The observed exponential decay in ∑5-209 levels over distance corresponded with Brisbane's decreasing population density (persons/km2) from the city centre. The residual error associated with the model increased significantly when BDE-209 levels were included, primarily due to the highest level (11.4 ± 1.8 ng PUF−1) being consistently detected at the industrial site, indicating a potential primary source at this site. Active air samples collected alongside the PAS at the industrial air monitoring site (B) indicated that BDE-209 dominated the congener composition and was entirely associated with the particulate phase. This study demonstrates that PAS are effective tools for monitoring citywide regional differences; however, interpretation of spatial trends for POPs that are predominantly associated with the particulate phase, such as BDE-209, may be restricted to identifying ‘hotspots’ rather than broad spatial trends.
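As an illustration of the fitting step, below is a minimal sketch of an exponential decay-to-baseline fit of PAS levels against distance from the CBD; the data points are placeholders, not the study's measurements.

```python
# Fit level(d) = a*exp(-k*d) + c, where c is the baseline (plateau) level.
import numpy as np
from scipy.optimize import curve_fit

def decay(d, a, k, c):
    return a * np.exp(-k * d) + c

distance_km = np.array([0.0, 2.0, 5.0, 10.0, 20.0, 30.0])  # assumed distances
levels = np.array([12.0, 8.5, 5.0, 2.5, 1.2, 0.9])         # assumed, ng/PUF

(a, k, c), _ = curve_fit(decay, distance_km, levels, p0=(10.0, 0.2, 1.0))
halving_km = np.log(2) / k  # distance over which the above-baseline signal halves
print(f"baseline ≈ {c:.2f} ng/PUF, halving distance ≈ {halving_km:.1f} km")
```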
Abstract:
To feel another person’s pulse is an intimate and physical interaction. In these prototypes we use near field communications to extend the tangible reach of our heartbeat, so that another person can feel our heartbeat at a distance. The work is an initial experiment in near field haptic interaction, and is used to explore the quality of interactions that result from feeling another person’s pulse. The work takes the form of two feathered white gauntlets, to be worn on the forearm. Each gauntlet contains a pulse sensor, a radio transmitter and a vibrator. The pulse of the wearer is transmitted to the other feathered gauntlet and transformed into haptic feedback. When there are two wearers, their heartbeats are exchanged, to be felt by each other without physical contact.
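As a sketch of the exchange described above (the device I/O functions are hypothetical placeholders for the pulse sensor, radio and vibrator drivers):

```python
# Each gauntlet runs the same loop: broadcast the local heartbeat and render
# the partner's heartbeat as vibration. The polling rate is an assumption.
import time

def read_pulse_sensor() -> bool:
    """Hypothetical driver: True when a heartbeat is currently detected."""
    return False

def radio_send(beat: bool) -> None:
    """Hypothetical driver: broadcast the local beat state."""

def radio_receive() -> bool:
    """Hypothetical driver: latest beat state received from the partner."""
    return False

def set_vibrator(on: bool) -> None:
    """Hypothetical driver: switch the haptic vibrator on or off."""

def exchange_loop(poll_hz: float = 50.0) -> None:
    while True:
        radio_send(read_pulse_sensor())   # share our heartbeat
        set_vibrator(radio_receive())     # feel the partner's heartbeat
        time.sleep(1.0 / poll_hz)
```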
Abstract:
Many commentators have treated the internet as a site of democratic freedom and as a new kind of public sphere. While there are good reasons for optimism, like any social space, digital space also has its dark side. Citizens and governments alike have expressed anxiety about cybercrime and cyber-security. In August 2011, the Australian government introduced legislation to give effect to Australia becoming a signatory to the European Convention on Cybercrime (2001). At the time of writing, that legislation is still before the Parliament. This article examines how the legal and policy-making process that enables Australia to comply with the European Convention on Cybercrime came about. Among the motivations that informed both the development of the Convention in Europe and the subsequent Australian exercise of legislating for compliance with it was a range of legitimate concerns about the impact that cybercrime can have on individuals and communities. This article makes the case that equal attention also needs to be given to ensuring that legislators and policy makers differentiate between legitimate security imperatives and any over-reach evident in the implementation of this legislation that affects rule-of-law principles, our capacity to engage in democratic practices, and our civic and human rights.
Abstract:
This paper presents a shared autonomy control scheme for a quadcopter suited to the inspection of vertical infrastructure: tall man-made structures such as streetlights, electricity poles or the exterior surfaces of buildings. Current approaches to the inspection of such structures are slow, expensive, and potentially hazardous. Low-cost aerial platforms with an ability to hover now have sufficient payload and endurance for this kind of task, but require significant human skill to fly. We develop a control architecture that enables synergy between the ground-based operator and the aerial inspection robot. An unskilled operator is assisted by onboard sensing and partial autonomy to safely fly the robot in close proximity to the structure. The operator uses their domain knowledge and problem-solving skills to guide the robot to difficult-to-reach locations to inspect and assess the condition of the infrastructure. The operator commands the robot in a local task coordinate frame with limited degrees of freedom (DOF), for instance up/down, left/right, and toward/away with respect to the infrastructure. We therefore avoid problems of global mapping and navigation while providing an intuitive interface to the operator. We describe algorithms for pole detection, robot velocity estimation with respect to the pole, and position estimation in 3D space, as well as the control algorithms and overall system architecture. We present initial results of shared autonomy of a quadrotor with respect to a vertical pole; robot performance is evaluated by comparison with motion capture data.
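To illustrate the task-frame command mapping, here is a minimal sketch that rotates the operator's pole-relative commands into body-frame velocity setpoints; the bearing convention and the idea of sourcing the bearing from the pole detector are assumptions for illustration.

```python
# Map operator commands (toward/away, left/right, up/down, expressed relative
# to the pole) into body-frame velocities using the bearing to the pole.
import math

def task_to_body_velocity(cmd_toward: float, cmd_right: float, cmd_up: float,
                          pole_bearing_rad: float):
    """pole_bearing_rad: bearing of the detected pole from the body x-axis,
    positive counter-clockwise (e.g. from the onboard pole detector)."""
    c, s = math.cos(pole_bearing_rad), math.sin(pole_bearing_rad)
    vx = c * cmd_toward + s * cmd_right   # body forward
    vy = s * cmd_toward - c * cmd_right   # body lateral (y left, so right is negative)
    vz = cmd_up                           # vertical axis is shared by both frames
    return vx, vy, vz

# Example: pole dead ahead (bearing 0), so "toward" maps to body forward.
print(task_to_body_velocity(0.2, 0.0, 0.1, 0.0))   # (0.2, 0.0, 0.1)
```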