989 results for Shopping-time
Abstract:
Background The incidence of obesity amongst patients presenting for elective Total Hip Arthroplasty (THA) has increased in the last decade, and the relationship between obesity and the need for joint replacement has been demonstrated. This study evaluates the effects of morbid obesity on outcomes following primary THA by comparing short-term outcomes in THA between a morbidly obese (BMI ≥40) and a normal weight (BMI 18.5 to <25) cohort at our institution between January 2003 and December 2010. Methods Thirty-nine patients included in the morbidly obese group were compared with 186 in the normal weight group. Operative time, length of stay, complications, readmission and length of readmission were compared. Results Operative time was increased in the morbidly obese group at 122 minutes compared with 100 minutes (p=0.002). Post-operatively, there was an increased 30-day readmission rate related to surgery of 12.8% associated with BMI ≥40 compared with 2.7% (p=0.005), as well as a 5.1-fold increase in surgery-related readmitted bed days: 0.32 bed days per patient for the normal weight group compared with 1.64 per patient for the morbidly obese group (p=0.026). Conclusion Morbidly obese patients present a technical challenge, and it is likely that this challenge and the resultant complications are underestimated. Further work is needed to enable suitable allocation of resources.
Abstract:
Arguments arising from quantum mechanics and gravitation theory, as well as from string theory, indicate that the description of space-time as a continuous manifold is not adequate at very short distances. An important candidate for the description of space-time at such scales is provided by noncommutative space-time, where the coordinates are promoted to noncommuting operators. Thus, the study of quantum field theory in noncommutative space-time provides an interesting interface where ordinary field-theoretic tools can be used to study the properties of quantum space-time. The three original publications in this thesis encompass various aspects of the still developing area of noncommutative quantum field theory, ranging from fundamental concepts to model building. One of the key features of noncommutative space-time is the apparent loss of Lorentz invariance, which has been addressed in different ways in the literature. One recently developed approach is to eliminate the Lorentz-violating effects by integrating over the parameter of noncommutativity. Fundamental properties of such theories are investigated in this thesis. Another issue addressed is model building, which is difficult in the noncommutative setting due to severe restrictions on the possible gauge symmetries imposed by the noncommutativity of the space-time. Possible ways to relieve these restrictions are investigated and applied, and a noncommutative version of the Minimal Supersymmetric Standard Model is presented. While putting the results obtained in the three original publications into their proper context, the introductory part of this thesis aims to provide an overview of the present situation in the field.
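For orientation, the most commonly used realisation of this idea is canonical (Moyal-type) noncommutativity; the relations below are the standard textbook form, shown here only to illustrate what "coordinates promoted to noncommuting operators" means, and are not necessarily the precise setting of the thesis's publications.

```latex
% Canonical (Moyal-type) noncommutative space-time: coordinates become operators
% with a constant antisymmetric noncommutativity parameter theta
[\hat{x}^{\mu}, \hat{x}^{\nu}] = i\,\theta^{\mu\nu}, \qquad \theta^{\mu\nu} = -\theta^{\nu\mu}.

% In field theory this is implemented by replacing ordinary products of fields
% with the Moyal star product
(f \star g)(x) = f(x)\,\exp\!\Big(\tfrac{i}{2}\,\overleftarrow{\partial}_{\mu}\,
\theta^{\mu\nu}\,\overrightarrow{\partial}_{\nu}\Big)\, g(x).
```

Since a constant θ^{μν} singles out preferred directions, Lorentz invariance is broken explicitly, which is the issue the integration-over-θ approach mentioned above is designed to address.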
Abstract:
We consider a scenario in which a wireless sensor network is formed by randomly deploying n sensors to measure some spatial function over a field, with the objective of computing a function of the measurements and communicating it to an operator station. We restrict ourselves to the class of type-threshold functions (as defined in the work of Giridhar and Kumar, 2005), of which max, min, and indicator functions are important examples: our discussions are couched in terms of the max function. We view the problem as one of message-passing distributed computation over a geometric random graph. The network is assumed to be synchronous, and the sensors synchronously measure values and then collaborate to compute and deliver the function computed with these values to the operator station. Computation algorithms differ in (1) the communication topology assumed and (2) the messages that the nodes need to exchange in order to carry out the computation. The focus of our paper is to establish (in probability) scaling laws for the time and energy complexity of the distributed function computation over random wireless networks, under the assumption of centralized contention-free scheduling of packet transmissions. First, without any constraint on the computation algorithm, we establish scaling laws for the computation time and energy expenditure for one-time maximum computation. We show that for an optimal algorithm, the computation time and energy expenditure scale, respectively, as Θ(√(n/log n)) and Θ(n) asymptotically as the number of sensors n → ∞. Second, we analyze the performance of three specific computation algorithms that may be used in specific practical situations, namely, the tree algorithm, multihop transmission, and the Ripple algorithm (a type of gossip algorithm), and obtain scaling laws for the computation time and energy expenditure as n → ∞. In particular, we show that the computation time for these algorithms scales as Θ(√(n/log n)), Θ(n), and Θ(√(n log n)), respectively, whereas the energy expended scales as Θ(n), Θ(√(n/log n)), and Θ(√(n log n)), respectively. Finally, simulation results are provided to show that our analysis indeed captures the correct scaling. The simulations also yield estimates of the constant multipliers in the scaling laws. Our analyses throughout assume a centralized optimal scheduler, and hence, our results can be viewed as providing bounds for the performance with practical distributed schedulers.
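As a rough illustration of the tree algorithm mentioned above (a sketch only: it ignores contention and scheduling, so it is not the paper's scheduled algorithm), the snippet below builds a random geometric graph, aggregates the maximum up a BFS tree rooted at the operator station, and reports the tree depth and the number of transmissions as crude proxies for time and energy.

```python
# Illustrative sketch of the tree algorithm (not the paper's scheduled algorithm):
# one-shot max computation over a random geometric graph via a BFS aggregation tree.
# Tree depth stands in for computation time and packet transmissions for energy;
# contention-free scheduling and radio details are ignored, and the graph is
# assumed connected (which holds w.h.p. at the radius used below).
import math
import random
from collections import defaultdict, deque

def random_geometric_graph(n, radius):
    pts = [(random.random(), random.random()) for _ in range(n)]
    adj = defaultdict(list)
    for i in range(n):
        for j in range(i + 1, n):
            if math.dist(pts[i], pts[j]) <= radius:
                adj[i].append(j)
                adj[j].append(i)
    return adj

def tree_max(adj, values, root=0):
    # Build a BFS tree rooted at the operator station and record each node's level.
    parent, level, order = {root: None}, {root: 0}, [root]
    dq = deque([root])
    while dq:
        u = dq.popleft()
        for v in adj[u]:
            if v not in parent:
                parent[v], level[v] = u, level[u] + 1
                order.append(v)
                dq.append(v)
    # Fuse partial maxima from the leaves towards the root: one packet per tree edge.
    partial = dict(enumerate(values))
    for v in reversed(order[1:]):              # children are processed before parents
        partial[parent[v]] = max(partial[parent[v]], partial[v])
    return partial[root], max(level.values()), len(order) - 1

if __name__ == "__main__":
    n = 500
    r = math.sqrt(2.0 * math.log(n) / n)       # connectivity-regime radius
    adj = random_geometric_graph(n, r)
    vals = [random.random() for _ in range(n)]
    mx, depth, tx = tree_max(adj, vals)
    print(f"max = {mx:.3f}, tree depth (time proxy) = {depth}, transmissions (energy proxy) = {tx}")
```

At the connectivity-regime radius r ≈ √(2 log n / n), the BFS tree depth grows like √(n/log n) while the number of transmissions is n − 1, mirroring the Θ(√(n/log n)) time and Θ(n) energy scaling stated above.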
Abstract:
Child sexual abuse is widespread and difficult to detect. To enhance case identification, many societies have enacted mandatory reporting laws requiring designated professionals, most often police, teachers, doctors and nurses, to report suspected cases to government child welfare agencies. Little research has explored the effects of introducing a reporting law on the number of reports made and the outcomes of those reports. This study explored the impact of a new legislative mandatory reporting duty for child sexual abuse in the State of Western Australia over seven years. We analysed data about the numbers and outcomes of reports by mandated reporters for the periods before the law (2006-08) and after the law (2009-12). Results indicate that the number of reports by mandated reporters of suspected child sexual abuse increased by a factor of 3.7, from an annual mean of 662 in the three-year pre-law period to 2448 in the four-year post-law period. The increase in the first two post-law years was contextually and statistically significant. Report numbers stabilised in 2010-12, at one report per 210 children. The number of investigated reports increased threefold, from an annual mean of 451 in the pre-law period to 1363 in the post-law period. A significant decline in the proportion of mandated reports that were investigated in the first two post-law years suggested that the new level of reporting and investigative need exceeded what was anticipated. However, a subsequent significant increase restored the pre-law proportion, suggesting systemic adaptive capacity. The number of substantiated investigations doubled, from an annual mean of 160 in the pre-law period to 327 in the post-law period, indicating that twice as many sexually abused children were being identified.
Abstract:
Instability in conventional haptic rendering destroys the perception of rigid objects in virtual environments. Inherent limitations in the conventional haptic loop restrict the maximum stiffness that can be rendered. In this paper we present a method to render virtual walls that are much stiffer than those achieved by conventional techniques. By removing the conventional digital haptic loop and replacing it with a part-continuous and part-discrete-time hybrid haptic loop, we were able to render stiffer walls. The control loop is implemented as a combinational logic circuit on a field-programmable gate array. We compared the performance of the conventional haptic loop and our hybrid haptic loop on the same haptic device, and present mathematical analysis to show the limit of stability of our device. Our hybrid method removes the computationally intensive haptic loop from the CPU; this can free a significant amount of resources that can be used for other purposes such as graphical rendering and physics modeling. It is our hope that, in the future, similar designs will lead to a haptics processing unit (HPU).
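The stiffness limit of a conventional sampled haptic loop can be seen in a few lines of simulation. The sketch below (illustrative parameters, not the paper's device, analysis, or hybrid loop) renders a virtual wall as a sampled spring with zero-order hold; with device damping b and sampling period T, the wall is passive only for roughly K ≤ 2b/T (Colgate's condition with no virtual damping), and stiffer walls feed energy into the contact so the bounces grow instead of decaying.

```python
# Minimal sketch (illustrative parameters, not the paper's device or method): a point
# mass under gravity bouncing on a virtual wall rendered by a conventional sampled
# haptic loop (spring force computed once per period T and held, i.e. zero-order hold).
# With device damping b, the rendered wall is passive only for roughly K <= 2*b/T;
# stiffer walls inject energy and the rebound apexes grow instead of decaying.
def bounce_apexes(K, T=1e-3, b=0.05, m=0.1, g=9.81, t_end=3.0, dt=1e-5):
    x, v = 0.02, 0.0        # released 2 cm above the wall located at x = 0
    f_wall = 0.0            # force held constant between haptic-loop samples
    next_tick, t = 0.0, 0.0
    apexes, prev_v = [], 0.0
    while t < t_end:
        if t >= next_tick:                     # discrete haptic-loop tick
            f_wall = -K * x if x < 0.0 else 0.0
            next_tick += T
        a = (f_wall - b * v) / m - g           # continuous plant dynamics
        prev_v, v = v, v + a * dt              # semi-implicit Euler integration
        x += v * dt
        if x > 0.0 and prev_v > 0.0 >= v:      # apex of a rebound above the wall
            apexes.append(x)
        t += dt
    return apexes

if __name__ == "__main__":
    for K in (50.0, 5000.0):                   # below vs. far above 2*b/T = 100 N/m
        a = bounce_apexes(K)
        print(f"K = {K:6.0f} N/m  first apex = {a[0]:.3f} m  last apex = {a[-1]:.3f} m")
```

Running it shows decaying rebounds for K = 50 N/m and growing ones for K = 5000 N/m; this sampling-induced energy leak is the kind of limitation the hybrid part-continuous loop described above is designed to overcome.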
Abstract:
Polymer-DNA conjugates in which one nucleic acid strand contains fluorine-substituted nucleobases have been prepared and characterised. The efficacy of these novel F-19 nucleic acid-polymer conjugates as sensitive and selective in vitro reporters of DNA binding events is demonstrated through a number of rapid-acquisition MR sequences. The conjugates respond readily and in a sequence-specific manner to external target oligonucleotide sequences by changes in hybridisation. In turn, these structural changes in polymer-nucleotide conjugates translate into responses which are detectable as fluorine relaxation and diffusion switches, and which can be monitored by in vitro Spin Echo and DOSY NMR spectroscopy. Although this approach is complementary to conventional FRET methods, the excellent diagnostic properties of fluorine nuclei make it a versatile and sensitive probe of molecular structure and conformation in polymeric assemblies.
Abstract:
A measurable electrical signal is generated when a gas flows over a variety of solids, including doped semiconductors, even at the modest speed of a few meters per second. The underlying mechanism is an interesting interplay of Bernoulli's principle and the Seebeck effect. The electrical signal depends on the square of the Mach number (M) and is proportional to the Seebeck coefficient (S) of the solid. Here we present an experimental estimate of the response time of the signal rise and fall process, i.e. how fast the semiconductor material responds to a steady flow as soon as it is switched on or off. A theoretical model is also presented to understand the process and the dependence of the response time on the nature and physical dimensions of the semiconductor material used, and its predictions are compared with the experimental observations.
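The mechanism can be sketched with two standard relations (shown here only for orientation; the paper's detailed model is more involved): adiabatic slow-down of the gas raises the local temperature by an amount set by the stagnation (Bernoulli) condition, and the Seebeck effect converts that temperature difference into a voltage.

```latex
% Stagnation (Bernoulli) heating of the gas, converted to a voltage by the Seebeck
% effect; gamma is the ratio of specific heats, T the gas temperature, M the Mach
% number and S the Seebeck coefficient of the solid (illustrative form only).
\Delta T \;\approx\; \frac{\gamma - 1}{2}\, M^{2}\, T,
\qquad
V \;\approx\; S\,\Delta T \;\propto\; S\, M^{2}.
```

This reproduces the V ∝ S M² dependence quoted above.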
Abstract:
It is known that by employing space-time-frequency codes (STFCs) in frequency-selective MIMO-OFDM systems, all three kinds of diversity, viz. spatial, temporal and multipath, can be exploited. There exist space-time-frequency block codes (STFBCs) designed using orthogonal designs with a constellation precoder to get full diversity (Z. Liu, Y. Xin and G. Giannakis, IEEE Trans. Signal Processing, Oct. 2002). Since orthogonal designs of rate one exist only for two transmit antennas, STFBCs of rate one and full diversity cannot be constructed using orthogonal designs for more than two transmit antennas. This paper presents a rate-one STFBC scheme for four transmit antennas designed using quasi-orthogonal designs along with co-ordinate interleaved orthogonal designs (Zafar Ali Khan and B. Sundar Rajan, Proc. ISIT 2002). Conditions on the signal sets that give full diversity are identified. Simulation results are presented to show the superiority of our codes over the existing ones.
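For reference, the rate-one complex orthogonal design for two transmit antennas alluded to above is the Alamouti code (a standard construction, quoted here only for context):

```latex
% Alamouti code: rows are time slots, columns are transmit antennas.
G_2(x_1, x_2) \;=\;
\begin{pmatrix}
 x_1      & x_2 \\
 -x_2^{*} & x_1^{*}
\end{pmatrix},
\qquad
G_2^{H} G_2 \;=\; \big(|x_1|^{2} + |x_2|^{2}\big)\, I_2 .
```

The column orthogonality is what yields full diversity with single-symbol decoding; since no rate-one complex orthogonal design exists beyond two antennas, the four-antenna scheme above resorts to quasi-orthogonal and co-ordinate interleaved designs instead.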
Abstract:
The problem of constructing space-time (ST) block codes over a fixed, desired signal constellation is considered. In this situation, there is a tradeoff between the transmission rate as measured in constellation symbols per channel use and the transmit diversity gain achieved by the code. The transmit diversity is a measure of the rate of polynomial decay of the pairwise error probability of the code with increase in the signal-to-noise ratio (SNR). In the setting of a quasi-static channel model, let n(t) denote the number of transmit antennas and T the block interval. For any n(t) <= T, a unified construction of (n(t) x T) ST codes is provided here, for a class of signal constellations that includes the familiar pulse-amplitude (PAM), quadrature-amplitude (QAM), and 2^K-ary phase-shift-keying (PSK) modulations as special cases. The construction is optimal as measured by the rate-diversity tradeoff and can achieve any given integer point on the rate-diversity tradeoff curve. An estimate of the coding gain realized is given. Other results presented here include i) an extension of the optimal unified construction to the multiple-fading-block case, ii) a version of the optimal unified construction in which the underlying binary block codes are replaced by trellis codes, iii) a linear dispersion form for the underlying binary block codes, iv) a Gray-mapped version of the unified construction, and v) a generalization of the construction to the S-ary case, corresponding to constellations of size S^K. Items ii) and iii) are aimed at simplifying the decoding of this class of ST codes.
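For orientation, the rate-diversity tradeoff referred to above can be stated as follows (this is the standard form of the tradeoff over a fixed constellation, included here only as a reminder):

```latex
% Rate-diversity tradeoff over a fixed constellation with n_t transmit antennas:
% a space-time code achieving transmit diversity gain d has rate at most
R \;\le\; n_t - d + 1 \quad \text{constellation symbols per channel use}.
```

The unified construction is optimal in the sense that, for every integer d with 1 ≤ d ≤ n_t, it achieves the corresponding maximal rate n_t − d + 1.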
Abstract:
In this paper, we present an analysis of the bit error rate (BER) performance of space-time block codes (STBCs) from generalized complex orthogonal designs for M-PSK modulation. In STBCs from complex orthogonal designs (COD), the norms of the column vectors are the same (e.g., the Alamouti code). However, in generalized COD (GCOD), the norms of the column vectors may not necessarily be the same (e.g., the rate-3/5 and rate-7/11 codes by Su and Xia in [1]). STBCs from GCOD are of interest because of the high rates that they can achieve (in [2], it has been shown that the maximum achievable rate for STBCs from GCOD is bounded by 4/5). While the BER performance of STBCs from COD (e.g., the Alamouti code) can be simply obtained from existing analytical expressions for receive diversity with the same diversity order by appropriately scaling the SNR, this cannot be done for STBCs from GCOD (because of the unequal norms of the column vectors). Our contribution in this paper is that we derive analytical expressions for the BER performance of any STBC from GCOD. Our BER analysis for GCOD captures the performance of STBCs from COD as special cases. We validate our results with two STBCs from GCOD reported by Su and Xia in [1], for 5 and 6 transmit antennas (G(5) and G(6) in [1]) with rates 7/11 and 3/5, respectively.
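As an example of the SNR-scaling argument for codes from COD mentioned above (the standard Alamouti case, included only for context):

```latex
% Alamouti code, 2 transmit / 1 receive antenna, total transmit power split equally
% across the two antennas: the post-combining symbol SNR is
\gamma_{\mathrm{eff}} \;=\; \frac{\bar{\gamma}}{2}\,\big(|h_1|^{2} + |h_2|^{2}\big),
```

i.e., the BER equals that of two-branch receive MRC evaluated at half the average SNR. For GCOD the columns have unequal norms, so no single SNR scaling works, which is the gap the analysis in this paper fills.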
Abstract:
This paper addresses the problem of detecting and resolving conflicts due to timing constraints imposed by features in real-time and hybrid systems. We consider systems composed of a base system with multiple features or controllers, each of which independently advises the system on how to react to input events so as to conform to its individual specification. We propose a methodology for developing such systems in a modular manner based on the notion of conflict-tolerant features, which are designed to continue offering advice even when their advice has been overridden in the past. We give a simple priority-based scheme for composing such features, which guarantees the maximal use of each feature. We provide a formal framework for specifying such features, and a compositional technique for verifying systems developed in this framework.
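The priority-based composition idea can be illustrated informally as follows (a generic sketch, not the paper's formal framework: the features, events and outputs are hypothetical, and the features are stateless for brevity). Each feature maps the current event to the set of outputs it allows; the composer honours features in priority order, overrides a feature for the current event only when honouring it would leave no allowed output, and keeps consulting it on later events.

```python
# Generic sketch of priority-based composition of advice-giving features (an informal
# illustration of the idea, not the paper's formal framework). Each feature maps the
# current input event to the set of outputs it allows; the composer applies features
# in priority order and overrides a lower-priority feature whenever honouring it would
# leave no allowed output. A conflict-tolerant feature keeps producing advice on later
# events even if its advice was overridden earlier.
from typing import Callable, Set

Advice = Callable[[str], Set[str]]   # event -> set of allowed outputs

def compose(features: list[Advice], outputs: Set[str]) -> Callable[[str], Set[str]]:
    def composed(event: str) -> Set[str]:
        allowed = set(outputs)
        for advise in features:                 # features listed highest priority first
            candidate = allowed & advise(event)
            if candidate:                       # honour this feature's advice...
                allowed = candidate
            # ...otherwise override it for this event, but keep consulting it later
        return allowed
    return composed

# Hypothetical toy features for a heating controller with outputs {"on", "off"}.
def frost_protection(event: str) -> Set[str]:
    return {"on"} if event == "temp_low" else {"on", "off"}

def energy_saver(event: str) -> Set[str]:
    return {"off"} if event in ("away", "temp_low") else {"on", "off"}

if __name__ == "__main__":
    controller = compose([frost_protection, energy_saver], {"on", "off"})
    for ev in ["temp_low", "away", "temp_ok"]:
        print(ev, "->", controller(ev))
```

On the event "temp_low" the energy_saver's advice conflicts with the higher-priority frost_protection and is overridden, yet it is still used on the later "away" event, which is the maximal-use behaviour the composition scheme aims for.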
Abstract:
This thesis studies quantile residuals and uses different methodologies to develop test statistics that are applicable in evaluating linear and nonlinear time series models based on continuous distributions. Models based on mixtures of distributions are of special interest because it turns out that for those models traditional residuals, often referred to as Pearson's residuals, are not appropriate. As such models have become more and more popular in practice, especially with financial time series data, there is a need for reliable diagnostic tools that can be used to evaluate them. The aim of the thesis is to show how such diagnostic tools can be obtained and used in model evaluation. The quantile residuals considered here are defined in such a way that, when the model is correctly specified and its parameters are consistently estimated, they are approximately independent with a standard normal distribution. All the tests derived in the thesis are pure significance type tests and are theoretically sound in that they properly take the uncertainty caused by parameter estimation into account.

In Chapter 2 a general framework based on the likelihood function and smooth functions of univariate quantile residuals is derived that can be used to obtain misspecification tests for various purposes. Three easy-to-use tests aimed at detecting non-normality, autocorrelation, and conditional heteroscedasticity in quantile residuals are formulated. It also turns out that these tests can be interpreted as Lagrange Multiplier or score tests, so that they are asymptotically optimal against local alternatives. Chapter 3 extends the concept of quantile residuals to multivariate models. The framework of Chapter 2 is generalized, and tests aimed at detecting non-normality, serial correlation, and conditional heteroscedasticity in multivariate quantile residuals are derived based on it. Score test interpretations are obtained for the serial correlation and conditional heteroscedasticity tests and, in a rather restricted special case, for the normality test. In Chapter 4 the tests are constructed using the empirical distribution function of quantile residuals. The so-called Khmaladze martingale transformation is applied in order to eliminate the uncertainty caused by parameter estimation. Various test statistics are considered so that critical bounds for histogram-type plots as well as Quantile-Quantile and Probability-Probability type plots of quantile residuals are obtained. Chapters 2, 3, and 4 contain simulations and empirical examples which illustrate the finite-sample size and power properties of the derived tests and also how the tests and related graphical tools based on residuals are applied in practice.
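For orientation, the quantile residuals referred to above are of the standard form (stated here in the univariate case):

```latex
% Quantile residual of observation y_t, with F_{t-1} the model-implied conditional
% distribution function given past observations and \hat\theta the parameter estimate:
r_{t} \;=\; \Phi^{-1}\!\big( F_{t-1}(y_{t};\, \hat{\theta}) \big).
```

If the model is correctly specified and the parameter estimate is consistent, the r_t are approximately independent standard normal, which is the property the tests in Chapters 2-4 exploit; the contribution of the thesis is to account properly for the estimation uncertainty when building tests on these residuals.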
Abstract:
In 2015, Victoria passed laws removing the time limit within which a survivor of child sexual abuse can commence a civil claim for personal injury. The law also applies to physical abuse, and to psychological injury arising from those forms of abuse. In 2016, New South Wales made almost identical legal reforms. These reforms were partly motivated by the recommendations of inquiries into institutional child abuse. Of particular relevance is that the Australian Royal Commission into Institutional Responses to Child Sexual Abuse recommended in 2015 that all States and Territories remove their time limits for civil claims. This presentation explores the problems with standard time limits when applied to child sexual abuse cases (whether occurring within or beyond institutions), the scientific, ethical and legal justifications for lifting the time limits, and solutions for future law reform.
Abstract:
The problem of scheduling divisible loads in distributed computing systems in the presence of processor release times is considered. The objective is to find the optimal sequence of load distribution and the optimal load fractions assigned to each processor in the system such that the processing time of the entire processing load is a minimum. This is a difficult combinatorial optimization problem, and hence a genetic algorithm approach is presented for its solution.
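A compact sketch of this kind of genetic-algorithm search is given below. It is illustrative only: the chromosome is a random-key encoding of the distribution sequence, the load fractions are held equal, and the timing model (single-installment sequential transmission with per-processor release times) as well as all numerical parameters are hypothetical simplifications rather than the paper's formulation, which also optimises the fractions themselves.

```python
# Compact sketch of a genetic-algorithm search over load-distribution sequences
# (illustrative only: random-key encoding, equal load fractions, a simplified
# single-installment timing model with per-processor release times).
import random

W = [1.0, 2.0, 3.0, 1.5, 2.5]      # hypothetical computation time per unit load
Z = [0.2, 0.4, 0.1, 0.3, 0.2]      # hypothetical communication time per unit load
R = [0.0, 0.5, 1.0, 0.2, 0.8]      # hypothetical processor release times
ALPHA = [1.0 / len(W)] * len(W)    # equal load fractions (simplification)

def makespan(order):
    """Finish time when the master transmits fractions sequentially in `order`."""
    t_comm, finish = 0.0, 0.0
    for i in order:
        t_comm += ALPHA[i] * Z[i]                      # sequential transmission
        start = max(t_comm, R[i])                      # wait for data and for release
        finish = max(finish, start + ALPHA[i] * W[i])
    return finish

def decode(keys):                   # random-key chromosome -> processor sequence
    return sorted(range(len(keys)), key=lambda i: keys[i])

def ga(pop_size=40, generations=200, mut=0.1):
    pop = [[random.random() for _ in W] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=lambda k: makespan(decode(k)))    # fittest (smallest) first
        survivors = pop[: pop_size // 2]
        children = []
        while len(survivors) + len(children) < pop_size:
            a, b = random.sample(survivors, 2)
            child = [(x if random.random() < 0.5 else y) for x, y in zip(a, b)]
            children.append([k + random.gauss(0, mut) for k in child])
        pop = survivors + children
    best = min(pop, key=lambda k: makespan(decode(k)))
    return decode(best), makespan(decode(best))

if __name__ == "__main__":
    seq, T = ga()
    print("best sequence:", seq, " makespan:", round(T, 3))
```

Random-key encoding keeps crossover and mutation trivially valid (any real vector decodes to a permutation), which is one common way to apply a genetic algorithm to sequencing problems of this kind.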
Abstract:
A passive wavelength/time fiber-optic code division multiple access (W/T FO-CDMA) network is a viable option for high-speed access networks. Constructions of 2-D codes, suitable for incoherent W/T FO-CDMA, have been proposed to reduce the time spread of the 1-D sequences. The 2-D constructions can be broadly classified as 1) hybrid codes and 2) matrix codes. In our earlier work [14], we had proposed a new family of wavelength/time multiple-pulses-per-row (W/T MPR) matrix codes which have good cardinality and spectral efficiency and at the same time have the lowest off-peak autocorrelation and cross-correlation values, equal to unity. In this paper we propose an architecture for a W/T MPR FO-CDMA network designed using presently available devices and technology. A complete FO-CDMA network of ten users is simulated for various numbers of simultaneous users, and it is shown that 0 → 1 errors can occur only when the number of interfering users is at least equal to the threshold value.
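The correlation properties mentioned above can be checked mechanically for any candidate 2-D code family. The small utility below (a generic sketch; the matrices shown are toy placeholders, not the W/T MPR codes of [14]) computes the off-peak autocorrelation and the cross-correlation of wavelength/time code matrices under cyclic time shifts.

```python
# Small utility sketch: check the off-peak autocorrelation and cross-correlation of
# 2-D wavelength/time codes under cyclic time shifts (wavelength rows are not shifted).
# The example matrices are toy placeholders, not the W/T MPR codes of the paper.
def correlation(a, b, shift):
    """2-D correlation of code matrices a and b with a cyclic shift in time (columns)."""
    rows, cols = len(a), len(a[0])
    return sum(a[r][c] * b[r][(c + shift) % cols] for r in range(rows) for c in range(cols))

def max_offpeak_auto(code):
    cols = len(code[0])
    return max(correlation(code, code, s) for s in range(1, cols))

def max_cross(code1, code2):
    cols = len(code1[0])
    return max(correlation(code1, code2, s) for s in range(cols))

if __name__ == "__main__":
    # Toy 3-wavelength x 5-chip codes (hypothetical, for illustration only).
    c1 = [[1, 0, 0, 0, 0],
          [0, 0, 1, 0, 0],
          [0, 0, 0, 0, 1]]
    c2 = [[0, 1, 0, 0, 0],
          [0, 0, 0, 0, 1],
          [0, 0, 1, 0, 0]]
    print("off-peak autocorrelation:", max_offpeak_auto(c1), max_offpeak_auto(c2))
    print("max cross-correlation  :", max_cross(c1, c2))
```

For a well-designed family both printed values stay at or below one, matching the unity bound quoted above; with cross-correlations of at most one, the interference at any chip cannot exceed the number of interfering users, which is consistent with 0 → 1 errors requiring at least a threshold number of interferers.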