932 results for Singleton bound
Abstract:
In this paper we investigate the distribution of the product of Rayleigh distributed random variables. Using the Mellin-Barnes inversion formula and the saddle point approach, we obtain an upper bound for the product distribution. The accuracy of this tail approximation increases as the number of random variables in the product increases.
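As a rough illustration of the quantity being bounded, the following Monte Carlo sketch estimates tail probabilities of a product of independent Rayleigh random variables; the scale parameter, number of factors, and thresholds are arbitrary assumptions, and the paper's Mellin-Barnes/saddle-point bound itself is not reproduced here.

```python
# Illustrative only: empirically estimates the tail probability of a product of
# independent Rayleigh random variables via Monte Carlo. The scale sigma, the
# number of factors N, and the thresholds are assumed values.
import numpy as np

rng = np.random.default_rng(0)
N = 4            # number of Rayleigh factors in the product (assumed)
sigma = 1.0      # common Rayleigh scale parameter (assumed)
samples = 1_000_000

# Product of N independent Rayleigh(sigma) variables, drawn row-wise.
prod = rng.rayleigh(scale=sigma, size=(samples, N)).prod(axis=1)

for t in (5.0, 10.0, 20.0):
    tail = np.mean(prod > t)     # empirical P(X1 * ... * XN > t)
    print(f"P(product > {t:>4}) ~= {tail:.3e}")
```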
Abstract:
Background Non-fatal health outcomes from diseases and injuries are a crucial consideration in the promotion and monitoring of individual and population health. The Global Burden of Disease (GBD) studies done in 1990 and 2000 have been the only studies to quantify non-fatal health outcomes across an exhaustive set of disorders at the global and regional level. Neither effort quantified uncertainty in prevalence or years lived with disability (YLDs). Methods Of the 291 diseases and injuries in the GBD cause list, 289 cause disability. For 1160 sequelae of the 289 diseases and injuries, we undertook a systematic analysis of prevalence, incidence, remission, duration, and excess mortality. Sources included published studies, case notification, population-based cancer registries, other disease registries, antenatal clinic serosurveillance, hospital discharge data, ambulatory care data, household surveys, other surveys, and cohort studies. For most sequelae, we used a Bayesian meta-regression method, DisMod-MR, designed to address key limitations in descriptive epidemiological data, including missing data, inconsistency, and large methodological variation between data sources. For some disorders, we used natural history models, geospatial models, back-calculation models (models calculating incidence from population mortality rates and case fatality), or registration completeness models (models adjusting for incomplete registration with health-system access and other covariates). Disability weights for 220 unique health states were used to capture the severity of health loss. YLDs by cause at age, sex, country, and year levels were adjusted for comorbidity with simulation methods. We included uncertainty estimates at all stages of the analysis. Findings Global prevalence for all ages combined in 2010 across the 1160 sequelae ranged from fewer than one case per 1 million people to 350 000 cases per 1 million people. Prevalence and severity of health loss were weakly correlated (correlation coefficient −0·37). In 2010, there were 777 million YLDs from all causes, up from 583 million in 1990. The main contributors to global YLDs were mental and behavioural disorders, musculoskeletal disorders, and diabetes or endocrine diseases. The leading specific causes of YLDs were much the same in 2010 as they were in 1990: low back pain, major depressive disorder, iron-deficiency anaemia, neck pain, chronic obstructive pulmonary disease, anxiety disorders, migraine, diabetes, and falls. Age-specific prevalence of YLDs increased with age in all regions and has decreased slightly from 1990 to 2010. Regional patterns of the leading causes of YLDs were more similar compared with years of life lost due to premature mortality. Neglected tropical diseases, HIV/AIDS, tuberculosis, malaria, and anaemia were important causes of YLDs in sub-Saharan Africa. Interpretation Rates of YLDs per 100 000 people have remained largely constant over time but rise steadily with age. Population growth and ageing have increased YLD numbers and crude rates over the past two decades. Prevalences of the most common causes of YLDs, such as mental and behavioural disorders and musculoskeletal disorders, have not decreased. Health systems will need to address the needs of the rising numbers of individuals with a range of disorders that largely cause disability but not mortality. Quantification of the burden of non-fatal health outcomes will be crucial to understand how well health systems are responding to these challenges. 
Effective and affordable strategies to deal with this rising burden are an urgent priority for health systems in most parts of the world. Funding Bill & Melinda Gates Foundation.
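As a rough illustration of what a comorbidity adjustment has to account for (not the study's actual simulation method), disability weights for co-occurring health states are often described as combining multiplicatively rather than additively:

```latex
% Illustrative multiplicative combination of disability weights DW_j for a
% person with k co-occurring health states; the GBD comorbidity simulation is
% considerably more involved than this single formula.
DW_{\text{combined}} = 1 - \prod_{j=1}^{k} \bigl(1 - DW_j\bigr)
```

For example, two conditions with weights 0·3 and 0·2 would combine to 1 − 0·7 × 0·8 = 0·44 rather than 0·5.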
Abstract:
In this paper, a hybrid smoothed finite element method (H-SFEM) is developed for solid mechanics problems by combining techniques of the finite element method (FEM) and the node-based smoothed finite element method (NS-FEM) using a triangular mesh. A weighting parameter is introduced into H-SFEM, and the strain field is further assumed to be the weighted average between compatible strains from FEM and smoothed strains from NS-FEM. We prove theoretically that the strain energy obtained from the H-SFEM solution lies between those from the compatible FEM solution and the NS-FEM solution, which guarantees the convergence of H-SFEM. Intensive numerical studies are conducted to verify these theoretical results and show that (1) the upper and lower bound solutions can always be obtained by adjusting the weighting parameter; (2) there exists a preferable value of this parameter at which H-SFEM produces an ultra-accurate solution.
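A minimal sketch of the weighted-average strain form described above, using α as our own notation for the weighting parameter (the abstract's own symbol did not survive extraction):

```latex
% Assumed blending of compatible (FEM) and smoothed (NS-FEM) strains;
% \alpha = 0 recovers FEM and \alpha = 1 recovers NS-FEM.
\hat{\boldsymbol{\varepsilon}}
  = (1-\alpha)\,\boldsymbol{\varepsilon}^{\mathrm{FEM}}
  + \alpha\,\boldsymbol{\varepsilon}^{\mathrm{NS}},
  \qquad 0 \le \alpha \le 1 .
```

With the compatible FEM model typically behaving overly stiff (lower-bound strain energy) and NS-FEM overly soft (upper-bound strain energy), sweeping the parameter between 0 and 1 moves the H-SFEM strain energy between the two bounds, consistent with the bounding property stated above.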
Abstract:
In a critical but sympathetic reading of Habermas’s work (1984, 1987a, 1987b, 2003), Luke Goode (2005) recently sought to rework his theory of deliberative democracy in an age of mediated and increasingly digital public spheres. Taking a different approach, Alan McKee (2005) challenged the culture- and class-bound strictures of Habermasian rationalism, instead pursuing a more radically pluralist account of postmodern public spheres. The editors of this special section of Media, Culture & Society invited us to discuss our differing approaches to the public sphere. Goode holds that the institutional bases of contemporary public spheres (political parties, educational institutions or public media) remain of critical importance, albeit in the context of a kaleidoscopic array of unofficial and informal micro-publics, both localized and de-territorialized. In contrast, McKee sustains a ‘hermeneutics of suspicion’ toward the official, hegemonic institutions of the public sphere since they tend to exclude and delegitimize discourses and practices that challenge their polite middle-class norms. McKee’s recent research has focused on sexual cultures, particularly among youth (McKee, 2011). Goode’s recent work has examined new social media spaces, particularly in relation to news and public debate (e.g. Goode, 2009; Goode et al., 2011). Consequently, our discussion turned to a domain which links our interests: after Goode discussed some of his recent research on (in)civility on YouTube as a new media public sphere, McKee challenged him to consider the case of pornographic websites modelled on social media sites.1 He identifies a greater degree of ‘civility’ in these pornographic sibling sites than on YouTube, requiring careful consideration of what constitutes a ‘public sphere’ in contemporary digital culture. Such sites represent an environment that shatters the opposition of public and private interest, affording public engagement on matters of the body, of intimacy, of gender politics, of pleasure and desire – said by many critics to be ruled out of court in Habermasian theory. Such environments also trouble traditional binaries between the cognitive and the affective, and between the performative and the deliberative. In what follows we explore the differences between our approaches in the form of a dialogue. As is often the case, our approaches seemed less at odds after engaging in conversation than may have initially appeared. But important differences of emphasis remain.
Abstract:
In 1980, Alltop produced a family of cubic phase sequences that nearly meet the Welch bound for maximum non-peak correlation magnitude. This family of sequences was shown by Wootters and Fields to be useful for quantum state tomography. Alltop’s construction used a function that is not planar, but whose difference function is planar. In this paper we show that Alltop type functions cannot exist in fields of characteristic 3, and that for a known class of planar functions, x^3 is the only Alltop type function.
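A small numerical check of the near-optimality claim, using one common way of writing an Alltop-style cubic phase family (the indexing below is ours, not necessarily the paper's): for a prime p ≥ 5, the p² unit vectors constructed here have maximum pairwise correlation magnitude 1/√p, close to the Welch bound 1/√(p+1).

```python
# Illustrative sketch of an Alltop-style cubic phase family over Z_p (p prime,
# p >= 5): v_{k,m}[n] = exp(2*pi*i*((n+k)^3 + m*n)/p) / sqrt(p). Distinct
# vectors correlate with magnitude at most 1/sqrt(p), nearly meeting the Welch
# bound 1/sqrt(p+1) for p^2 unit vectors in dimension p.
import numpy as np

p = 7                              # prime >= 5 (characteristic not 2 or 3)
n = np.arange(p)
vecs = []
for k in range(p):
    for m in range(p):
        phase = ((n + k) ** 3 + m * n) % p
        vecs.append(np.exp(2j * np.pi * phase / p) / np.sqrt(p))
V = np.array(vecs)                 # p^2 unit-norm vectors of length p

G = np.abs(V @ V.conj().T)         # pairwise correlation magnitudes
np.fill_diagonal(G, 0.0)           # ignore self-correlations
print("max correlation :", G.max())            # ~ 1/sqrt(p)
print("1/sqrt(p)       :", 1 / np.sqrt(p))
print("Welch bound     :", 1 / np.sqrt(p + 1)) # for p^2 vectors in C^p
```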
Abstract:
The ultimate goal of an access control system is to allocate each user the precise level of access they need to complete their job - no more and no less. This proves challenging in an organisational setting. On the one hand, employees need enough access to the organisation’s resources to perform their jobs; on the other hand, more access brings an increasing risk of misuse - either intentional, where an employee uses the access for personal benefit, or unintentional, through carelessness or being socially engineered into giving access to an adversary. This thesis investigates issues with existing approaches to access control in allocating the optimal level of access to users and proposes solutions in the form of new access control models. These issues are most evident when uncertainty surrounding users’ access needs, incentives to misuse and accountability is considered, hence the title of the thesis. We first analyse access control in environments where the administrator is unable to identify the users who may need access to resources. To resolve this uncertainty, an administrative model with delegation support is proposed. Further, a detailed technical enforcement mechanism is introduced to ensure delegated resources cannot be misused. Then we explicitly consider that users are self-interested and capable of misusing resources if they choose to. We propose a novel game-theoretic access control model to reason about and influence the factors that may affect users’ incentive to misuse. Next we study access control in environments where users’ access needs cannot be predicted and users cannot be held accountable for misuse. It is shown that by allocating a budget to users - a virtual currency through which they can pay for the resources they deem necessary - the need for a precise pre-allocation of permissions can be relaxed. The budget also imposes an upper bound on users’ ability to misuse. A generalised budget allocation function is proposed, and it is shown that, given the context information, the optimal level of budget for users can always be determined numerically. Finally, the Role Based Access Control (RBAC) model is analysed under the explicit assumption of administrators’ uncertainty about self-interested users’ access needs and their incentives to misuse. A novel Budget-oriented Role Based Access Control (B-RBAC) model is proposed. The new model introduces the notion of users’ behaviour into RBAC and provides means to influence users’ incentives. It is shown how RBAC policy can be used to individualise the cost of access to resources and also to determine users’ budgets. The implementation overheads of B-RBAC are examined and several low-cost sub-models are proposed.
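A purely hypothetical sketch of the budget idea described above (not the thesis's B-RBAC model): allocating a virtual currency to a user caps how much access they can consume, so misuse is bounded by the remaining budget rather than by a precise pre-allocation of permissions. All names and costs are illustrative.

```python
# Hypothetical sketch only: a virtual-currency budget bounds a user's total
# access, relaxing the need for precise permission pre-allocation.
from dataclasses import dataclass, field

@dataclass
class BudgetedUser:
    name: str
    budget: float                      # virtual currency allocated up front
    spent: dict = field(default_factory=dict)

    def request(self, resource: str, cost: float) -> bool:
        """Grant access iff the user can still pay the resource's cost."""
        if cost > self.budget:
            return False               # budget exhausted: request denied
        self.budget -= cost
        self.spent[resource] = self.spent.get(resource, 0.0) + cost
        return True

alice = BudgetedUser("alice", budget=10.0)
print(alice.request("payroll_db", cost=4.0))   # True
print(alice.request("payroll_db", cost=4.0))   # True
print(alice.request("hr_records", cost=4.0))   # False: only 2.0 left
```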
Abstract:
Trajectory basis Non-Rigid Structure From Motion (NRSFM) currently faces two problems: the limit of reconstructability and the need to tune the basis size for different sequences. This paper provides a novel theoretical bound on 3D reconstruction error, arguing that the existing definition of reconstructability is fundamentally flawed in that it fails to consider system condition. This insight motivates a novel strategy whereby the trajectory's response to a set of high-pass filters is minimised. The new approach eliminates the need to tune the basis size and is more efficient for long sequences. Additionally, the truncated DCT basis is shown to have a dual interpretation as a high-pass filter. The success of trajectory filter reconstruction is demonstrated quantitatively on synthetic projections of real motion capture sequences and qualitatively on real image sequences.
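A small sketch of the dual interpretation mentioned above: projecting a 1D trajectory onto a truncated DCT basis is a low-pass operation, so the residual operator acts as a high-pass filter. The basis size and the test trajectories are arbitrary assumptions.

```python
# Illustrative only: a truncated orthonormal DCT-II basis Phi acts as a
# low-pass reconstruction, so the residual operator (I - Phi Phi^T) behaves
# like a high-pass filter on a 1D trajectory.
import numpy as np

F, K = 100, 10                         # frames, truncated basis size (assumed)
f = np.arange(F)
k = np.arange(K)
Phi = np.sqrt(2.0 / F) * np.cos(np.pi * (2 * f[:, None] + 1) * k[None, :] / (2 * F))
Phi[:, 0] /= np.sqrt(2.0)              # orthonormal DCT-II columns

smooth = np.sin(2 * np.pi * f / F)                 # slowly varying trajectory
jagged = np.sign(np.sin(2 * np.pi * 12 * f / F))   # fast-varying trajectory

def highpass(x):
    return x - Phi @ (Phi.T @ x)       # residual after low-pass projection

print("high-pass energy, smooth :", np.linalg.norm(highpass(smooth)))
print("high-pass energy, jagged :", np.linalg.norm(highpass(jagged)))
```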
Abstract:
This paper considers the problem of reconstructing the motion of a 3D articulated tree from 2D point correspondences subject to some temporal prior. Hitherto, smooth motion has been encouraged using a trajectory basis, yielding a hard combinatorial problem with time complexity growing exponentially in the number of frames. Branch and bound strategies have previously attempted to curb this complexity whilst maintaining global optimality. However, they provide no guarantee of being more efficient than exhaustive search. Inspired by recent work which reconstructs general trajectories using compact high-pass filters, we develop a dynamic programming approach which scales linearly in the number of frames, leveraging the intrinsically local nature of filter interactions. Extension to affine projection enables reconstruction without estimating cameras.
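A generic sketch of the kind of chain-structured dynamic programming alluded to above: when the objective couples only adjacent frames (as with compact, local filter interactions), the optimum over all frame-wise assignments is found in a single pass that is linear in the number of frames. The state space and costs below are placeholders, not the paper's articulated-tree formulation.

```python
# Generic Viterbi-style sketch: cost couples only adjacent frames, so the
# optimum is found in O(T * S^2) time, linear in the number of frames T.
import numpy as np

def chain_dp(unary, pairwise):
    """unary: (T, S) per-frame costs; pairwise: (S, S) transition costs."""
    T, S = unary.shape
    cost = unary[0].copy()
    back = np.zeros((T, S), dtype=int)
    for t in range(1, T):                          # one pass over frames
        total = cost[:, None] + pairwise + unary[t][None, :]
        back[t] = total.argmin(axis=0)             # best previous state
        cost = total.min(axis=0)
    # Recover the optimal state sequence by backtracking.
    path = [int(cost.argmin())]
    for t in range(T - 1, 0, -1):
        path.append(int(back[t, path[-1]]))
    return path[::-1], float(cost.min())

rng = np.random.default_rng(1)
path, best = chain_dp(rng.random((50, 8)), rng.random((8, 8)))
print(len(path), best)
```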
Abstract:
Automated process discovery techniques aim at extracting models from information system logs in order to shed light on the business processes supported by these systems. Existing techniques in this space are effective when applied to relatively small or regular logs, but otherwise generate large and spaghetti-like models. In previous work, trace clustering has been applied in an attempt to reduce the size and complexity of automatically discovered process models. The idea is to split the log into clusters and to discover one model per cluster. The result is a collection of process models -- each one representing a variant of the business process -- as opposed to an all-encompassing model. Still, models produced in this way may exhibit unacceptably high complexity. In this setting, this paper presents a two-way divide-and-conquer process discovery technique, wherein the discovered process models are split on the one hand by variants and on the other hand hierarchically by means of subprocess extraction. The proposed technique allows users to set a desired bound for the complexity of the produced models. Experiments on real-life logs show that the technique produces collections of models that are up to 64% smaller than those extracted under the same complexity bounds by applying existing trace clustering techniques.
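A hypothetical skeleton of the divide-and-conquer idea: keep splitting the log and re-discovering models until every model satisfies the user-set complexity bound. The helpers discover_model, complexity, and split_log are placeholders rather than a real process-mining API, and the hierarchical subprocess extraction step is omitted.

```python
# Hypothetical skeleton only: recursively split an event log and re-discover
# models until every model satisfies a user-set complexity bound.
def discover_within_bound(log, bound, discover_model, complexity, split_log):
    model = discover_model(log)
    if complexity(model) <= bound or len(log) <= 1:
        return [model]                        # simple enough (or log too small to split)
    models = []
    for sublog in split_log(log):             # e.g. cluster traces by variant
        models.extend(discover_within_bound(
            sublog, bound, discover_model, complexity, split_log))
    return models                             # one model per variant/cluster

# Toy usage with stand-in helpers (a "model" is just the sorted event alphabet):
toy = [["a", "b"], ["a", "c"], ["a", "b", "b"]]
print(discover_within_bound(
    toy, bound=2,
    discover_model=lambda l: sorted({e for t in l for e in t}),
    complexity=len,
    split_log=lambda l: [l[: len(l) // 2], l[len(l) // 2:]]))
```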
Abstract:
GPS is a commonly used and convenient technology for determining absolute position in outdoor environments, but its high power consumption leads to rapid battery depletion in mobile devices. An obvious solution is to duty cycle the GPS module, which prolongs the device lifetime at the cost of increased position uncertainty while the GPS is off. This article addresses the trade-off between energy consumption and localization performance in a mobile sensor network application. The focus is on augmenting GPS location with more energy-efficient location sensors to bound position estimate uncertainty while GPS is off. Empirical GPS and radio contact data from a large-scale animal tracking deployment is used to model node mobility, radio performance, and GPS. Because GPS takes a considerable, and variable, time after powering up before it delivers a good position measurement, we model the GPS behaviour through empirical measurements of two GPS modules. These models are then used to explore duty cycling strategies for maintaining position uncertainty within specified bounds. We then explore the benefits of using short-range radio contact logging alongside GPS as an energy-inexpensive means of lowering uncertainty while the GPS is off, and we propose strategies that use RSSI ranging and GPS back-offs to further reduce energy consumption. Results show that our combined strategies can cut node energy consumption by one third while still meeting application-specific positioning criteria.
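An illustrative sketch of uncertainty-bounded duty cycling, with assumed parameters rather than the deployment's empirical models: while the GPS is off, worst-case position uncertainty grows at the node's maximum speed, so the next fix is scheduled before the application bound would be violated; a radio contact with a node of known position caps the uncertainty at the radio range and defers the next fix.

```python
# Illustrative sketch of uncertainty-bounded GPS duty cycling. All parameters
# are assumed values, not the deployment's measured models.
MAX_SPEED = 1.0             # m/s, worst-case node speed (assumed)
BOUND = 100.0               # m, application-specific uncertainty bound (assumed)
RADIO_RANGE = 30.0          # m, uncertainty after contact with a known-position node
GPS_FIX_UNCERTAINTY = 5.0   # m, uncertainty right after a good GPS fix (assumed)

def seconds_until_next_fix(current_uncertainty: float) -> float:
    """Longest time the GPS may stay off before the bound would be violated."""
    return max(0.0, (BOUND - current_uncertainty) / MAX_SPEED)

u = GPS_FIX_UNCERTAINTY
print("sleep after GPS fix      :", seconds_until_next_fix(u), "s")

u = RADIO_RANGE             # a radio contact caps uncertainty at the radio range
print("sleep after radio contact:", seconds_until_next_fix(u), "s")
```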
Abstract:
Fire safety design is important to eliminate the loss of property and lives during fire events. Gypsum plasterboard is widely used as a fire safety material in the building industry all over the world. It contains gypsum (CaSO4·2H2O) and calcium carbonate (CaCO3) and, most importantly, free and chemically bound water in its crystal structure. The dehydration of the gypsum and the decomposition of the calcium carbonate absorb heat, which gives gypsum plasterboard its fire-resistant qualities. Currently, plasterboard manufacturers use additives such as vermiculite to overcome shrinkage of the gypsum core and glass fibre to bridge shrinkage cracks and enhance the integrity of the board during calcination and after the loss of paper facings in fires. Past research has also attempted to reduce the thermal conductivity of plasterboards using fillers. However, no research has been undertaken to enhance the specific heat of plasterboard and the points of dehydration using chemical additives and fillers. Hence, detailed experimental studies of powdered samples of plasterboard mixed with chemical additives and fillers in varying proportions were conducted. These tests showed an enhancement of the specific heat of plasterboard. Numerical models were also developed to investigate the thermal performance of enhanced plasterboards under standard fire conditions. The results showed that the use of these enhanced plasterboards in steel wall systems can significantly improve their fire performance. This paper presents the details of this research and the results that can be used to enhance the fire safety of steel wall systems commonly used in buildings.
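To indicate what such a numerical model might look like, the sketch below runs a one-dimensional explicit finite-difference conduction model in which the apparent specific heat spikes around the gypsum dehydration range; all material properties, the geometry, and the fire-side temperature are assumed round numbers, not the paper's measured or enhanced-plasterboard values.

```python
# Illustrative only: 1D explicit finite-difference conduction through a board,
# with a temperature-dependent apparent specific heat that spikes around the
# gypsum dehydration range to mimic the heat absorbed by dehydration.
import numpy as np

L, nx = 0.016, 33                 # 16 mm board, grid points (assumed)
dx = L / (nx - 1)
k, rho = 0.25, 800.0              # W/mK, kg/m^3 (assumed constants)
dt, t_end = 0.05, 600.0           # time step and duration, s

def cp(T):                        # apparent specific heat, J/kgK (assumed shape)
    base = 1000.0
    dehydration = 20000.0 * np.exp(-((T - 140.0) / 25.0) ** 2)  # peak near dehydration
    return base + dehydration

T = np.full(nx, 20.0)             # initial temperature, deg C
for _ in range(int(t_end / dt)):
    T[0] = 800.0                  # fire-exposed face (assumed fixed temperature)
    alpha = k / (rho * cp(T))     # local thermal diffusivity
    T[1:-1] += alpha[1:-1] * dt / dx**2 * (T[2:] - 2*T[1:-1] + T[:-2])
    T[-1] = T[-2]                 # simplified zero-gradient ambient-side boundary

print("unexposed-face temperature after 10 min: %.1f C" % T[-1])
```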
Abstract:
This study explored the health needs, familial and social problems of Thai migrants in a local community in Brisbane, Australia. Five focus groups with Thai migrants were conducted. The qualitative data were examined using thematic content analysis that is specifically designed for focus group analysis. Four themes were identified: (1) positive experiences in Australia, (2) physical health problems, (3) mental health problems, and (4) familial and social health problems. This study revealed key health needs related to chronic disease and mental health, major barriers to health service use, such as language skills, and facilitating factors, such as the Thai Temple. We concluded that because the health needs, familial and social problems of Thai migrants were complex and culture bound, the development of health and community services for Thai migrants needs to take account of the ways in which Thai culture both negatively impacts health and offers positive solutions to problems.
Abstract:
The success of contemporary organizations depends on their ability to make appropriate decisions. Making appropriate decisions is inevitably bound to the availability and provision of relevant information. Information systems should be able to provide information in an efficient way. Thus, within information systems development, a detailed analysis of information supply and information demands has to prevail. Based on Szyperski’s information set and subset-model, we will give an epistemological foundation of information modeling in general and show why conceptual modeling in particular is capable of specifying effective and efficient information systems. Furthermore, we derive conceptual modeling requirements based on our findings. A short example illustrates the usefulness of a conceptual data modeling technique for the specification of information systems.
Abstract:
In a business environment, making the right decisions is vital for the success of a company. Making the right decisions is inevitably bound to the availability and provision of relevant information. Information systems are supposed to be able to provide this information in an efficient way. Thus, within information systems development, a detailed analysis of information supply and information demands has to prevail. Based on Szyperski’s information set and subset-model, we will give an epistemological foundation of information modeling in general and show why conceptual modeling in particular is capable of developing effective and efficient information systems. Furthermore, we derive conceptual modeling requirements based on our findings.
Abstract:
Hydrogels are hydrophilic, three-dimensional polymers that imbibe large quantities of water while remaining insoluble in aqueous solutions due to chemical or physical cross-linking. The polymers swell in water or biological fluids, immobilizing the bioactive agent and leading to drug release in a well-defined, specific manner. Thus the hydrogels’ elastic properties, swellability and biocompatibility make them excellent formulations for drug delivery. Currently, many drug potencies and therapeutic effects are limited or otherwise reduced because of the partial degradation that occurs before the administered drug reaches the desired site of action. On the other hand, sustained-release medications release drugs continually, rather than providing relief of symptoms and protection solely when necessary. In fact, it would be much better if drugs could be administered in a manner that precisely matches physiological needs at desired times and at the desired site (site-specific targeting). There is therefore an unmet need to develop controlled drug delivery systems, especially for the delivery of peptide and protein bound drugs. The purpose of this project is to produce hydrogels for structural drug delivery and time-dependent sustained release of drugs (bioactive agents). We use an innovative polymerisation strategy based on native chemical ligation (NCL) to covalently cross-link polymers to form hydrogels. When mixed in aqueous solution, four-armed poly(ethylene glycol) amine (PEG-4A) end-functionalised with thioester and four-branched N-terminal cysteine peptide dendrimers spontaneously conjugated to produce biomimetic hydrogels. These hydrogels showed superior resistance to shear stress compared to an equivalent PEG macromonomer system and were shown to be proteolytically degradable, with concomitant release of a model payload molecule. This is the first report of a peptide dendrimer/PEG macromonomer approach to hydrogel production, and it opens up the prospect of facile hydrogel synthesis together with tailored payload release.