130 results for Input-output analysis


Relevance: 30.00%

Abstract:

To identify current emergency department (ED) models of care and their impact on care quality, care effectiveness, and cost. A systematic search of key health databases (Medline, CINAHL, Cochrane, EMbase) was conducted to identify literature on ED models of care. Additionally, a focused review of the contents of 11 international and national emergency medicine, nursing, and health economic journals (published between 2010 and 2013) was undertaken, with snowball identification of references from the most recent and relevant papers. Articles published between 1998 and 2013 in the English language were included for initial review by three of the authors. Studies conducted in underdeveloped countries or not addressing the objectives of the present study were excluded. Relevant details were extracted from the retrieved literature and analysed for relevance and impact. The literature was synthesised around the study's main themes. The models described in the literature mainly addressed issues at the input, throughput, or output stages of ED care delivery. Models often varied to account for site-specific characteristics (e.g. onsite inpatient units), staffing profiles (e.g. extended-scope physiotherapists), ED geographical location (e.g. metropolitan or rural sites), and patient demographic profile (e.g. paediatrics, older persons, ethnicity). Only a few studies conducted cost-effectiveness analyses of service models. Although various models of delivering emergency healthcare exist, further research is required to make accurate and reliable assessments of their safety, clinical effectiveness, and cost-effectiveness.

Relevance: 30.00%

Abstract:

In this paper, a novel 2×2 multiple-input multiple-output orthogonal frequency division multiplexing (MIMO-OFDM) testbed based on the Analog Devices AD9361 highly integrated radio frequency (RF) agile transceiver was implemented specifically to estimate and analyze MIMO-OFDM channel capacity in vehicle-to-infrastructure (V2I) environments in the 920 MHz industrial, scientific, and medical (ISM) band. We implemented two-dimensional discrete cosine transform-based filtering to reduce channel estimation errors and show its effectiveness on our measurement results. We also analyzed, by simulation, the effect of channel estimation error on MIMO channel capacity. Three subcarrier spacing scenarios were investigated, corresponding to the IEEE 802.11p, Long-Term Evolution (LTE), and Digital Video Broadcasting - Terrestrial (DVB-T, 2k mode) standards. An extensive MIMO-OFDM V2I channel measurement campaign was performed in a suburban environment. Analysis of the measured MIMO channel capacity as a function of the transmitter-to-receiver (TX-RX) separation distance up to 250 m shows that, using a fixed receiver signal-to-noise ratio (SNR) criterion, the variance of the MIMO channel capacity is larger for near-range line-of-sight (LOS) scenarios than for long-range non-LOS cases. We observed that the largest capacity values were achieved under LOS propagation, despite the common assumption of a degenerate MIMO channel in LOS. We attribute this to the large angular spacing between MIMO subchannels that occurs when the receiver vehicle's rooftop antennas pass the fixed transmitter antennas at close range, rendering the MIMO subchannels near-orthogonal. In addition, analysis of the effect of different subcarrier spacings on MIMO-OFDM channel capacity showed negligible differences in mean channel capacity over the range investigated. The measured channels described in this paper are available on request.
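The capacity comparison above rests on the standard equal-power MIMO capacity formula, C = log2 det(I + (SNR/Nt) HH^H). The Python sketch below is illustrative only (the channel matrices are hypothetical, not drawn from the measurement campaign) and shows why orthogonal LOS subchannels outperform a degenerate rank-1 channel at the same fixed receiver SNR:

    import numpy as np

    def mimo_capacity(H, snr_linear):
        """Equal-power MIMO capacity in bit/s/Hz for one channel realisation:
        C = log2 det(I + (SNR / Nt) * H H^H)."""
        nr, nt = H.shape
        gram = H @ H.conj().T
        det = np.linalg.det(np.eye(nr) + (snr_linear / nt) * gram)
        return float(np.log2(det.real))

    snr = 10 ** (20 / 10)                                             # fixed 20 dB receiver SNR
    H_orth = np.array([[1, 0], [0, 1]], dtype=complex)                # orthogonal LOS subchannels
    H_rank1 = np.array([[1, 1], [1, 1]], dtype=complex) / np.sqrt(2)  # degenerate "keyhole" LOS
    print(mimo_capacity(H_orth, snr))    # 2 * log2(1 + 50) ~= 11.3
    print(mimo_capacity(H_rank1, snr))   # log2(1 + 100)    ~= 6.7

Both matrices carry the same total power (equal Frobenius norm), so the capacity gap is purely an effect of subchannel orthogonality, consistent with the near-range LOS observation above.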

Relevance: 30.00%

Abstract:

This study estimates the environmental efficiency of internationally listed firms in 10 worldwide sectors from 2007 to 2013 by applying an order-m method, a non-parametric approach based on the free disposal hull with subsampling bootstrapping. Using a conventional output of gross profit and two conventional inputs of labor and capital, the study examines order-m environmental efficiency accounting for the presence of each of 10 undesirable inputs/outputs and measures the shadow price of each undesirable input and output. The results show greater potential for reducing undesirable inputs than undesirable outputs: on average, total energy, electricity, or water usage could be reduced by about 50%. The median shadow prices of undesirable inputs, however, are much higher than the surveyed representative market prices. Approximately 10% of the firms in the sample appear to be potential sellers or production reducers with respect to undesirable inputs/outputs, which implies that current price levels for each item have little impact on most of the firms. Moreover, the study shows that a firm's environmental, social, and governance activities do not considerably affect its environmental efficiency.
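For context, the order-m estimator benchmarks each firm against the expected best performance of m peers drawn at random from those producing at least as much output, rather than against the full-sample frontier. A minimal single-input, single-output Python sketch of that idea follows (the study itself handles labor, capital, and an undesirable input jointly; names and sample sizes here are illustrative):

    import numpy as np

    rng = np.random.default_rng(0)

    def order_m_input_efficiency(x0, y0, X, Y, m=25, draws=500):
        """Input-oriented order-m efficiency of a unit using input x0 to
        produce output y0, benchmarked against sample inputs X, outputs Y.
        Scores below 1 mean the expected best of m dominating peers uses
        proportionally less input."""
        dominating = X[Y >= y0]              # peers producing at least y0
        if dominating.size == 0:
            return np.nan
        best = [rng.choice(dominating, size=m, replace=True).min()
                for _ in range(draws)]
        return float(np.mean(best) / x0)

    # Illustrative data: 200 firms, one input (e.g. energy use), one output
    X = rng.uniform(1.0, 10.0, 200)
    Y = np.sqrt(X) * rng.uniform(0.5, 1.0, 200)
    print(order_m_input_efficiency(x0=X[0], y0=Y[0], X=X, Y=Y))

Unlike a full-frontier score, an order-m score can exceed 1, which is what makes the estimator robust to outliers.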

Relevance: 30.00%

Abstract:

This paper presents preliminary findings on measuring technical efficiency in Malaysian Real Estate Investment Trusts (REITs) using Data Envelopment Analysis (DEA), with the aim of determining best practice in operations, including asset allocation and scale size, to improve the performance of Malaysian REITs. Variables identified as inputs and outputs are assessed in this cross-sectional analysis using the operational approach and variable-returns-to-scale DEA (VRS-DEA), focusing on Malaysian REITs for the year 2013. Islamic REITs have higher efficiency scores than conventional REITs in both models, and diversified REITs are more efficient than specialised REITs in both models. In Model 1, inefficiency is attributable mainly to managerial inefficiency rather than scale inefficiency, indicating that inputs are not fully minimised to produce the observed outputs. However, when other expenses are treated as a separate input variable, the efficiency score rises from 60.3% to 81.2%. In Model 2, scale inefficiency contributes more to overall inefficiency than managerial inefficiency. These results suggest that Malaysian REITs have been operating at the wrong scale, as the majority are operating at decreasing returns to scale.
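As a reference point for the model structure, the input-oriented variable-returns-to-scale (BCC) DEA score of a unit is the smallest theta for which a convex combination of the observed units uses no more than theta times its inputs while producing at least its outputs. A minimal sketch with scipy's linear programming solver follows (variable names and data are illustrative; the paper's exact input/output specification is not reproduced here):

    import numpy as np
    from scipy.optimize import linprog

    def dea_vrs_input(X, Y, j0):
        """Input-oriented VRS (BCC) DEA efficiency of unit j0.
        X: (n_units, n_inputs); Y: (n_units, n_outputs). Returns theta <= 1."""
        n, m = X.shape
        s = Y.shape[1]
        c = np.zeros(n + 1)
        c[0] = 1.0                                       # minimise theta
        # inputs:  sum_j lambda_j x_ij - theta x_i,j0 <= 0
        A_in = np.hstack([-X[j0].reshape(m, 1), X.T])
        # outputs: -sum_j lambda_j y_rj <= -y_r,j0
        A_out = np.hstack([np.zeros((s, 1)), -Y.T])
        res = linprog(c,
                      A_ub=np.vstack([A_in, A_out]),
                      b_ub=np.concatenate([np.zeros(m), -Y[j0]]),
                      A_eq=np.hstack([[0.0], np.ones(n)]).reshape(1, -1),  # VRS: sum lambda = 1
                      b_eq=[1.0],
                      bounds=[(0, None)] * (n + 1))
        return res.x[0]

    # Illustrative one-input, one-output sample of four units
    X = np.array([[2.0], [4.0], [6.0], [3.0]])
    Y = np.array([[1.0], [3.0], [4.0], [1.5]])
    print([round(dea_vrs_input(X, Y, j), 3) for j in range(4)])  # unit 3 ~ 0.833

Dropping the convexity constraint (sum of lambda = 1) would give the constant-returns-to-scale model; the gap between the two scores is exactly the scale inefficiency the abstract refers to.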

Relevance: 30.00%

Abstract:

Computational fluid dynamics (CFD) and particle image velocimetry (PIV) are commonly used to evaluate flow characteristics during the development of blood pumps. CFD allows rapid changes to pump parameters to optimise pump performance without having to construct a costly prototype. These techniques were used in the construction of a bi-ventricular assist device (BVAD), which combines the functions of an LVAD and an RVAD in a compact unit. The BVAD consists of two separate chambers with similar impellers, volutes, and inlet and outlet sections. To achieve the required flow characteristics of an average flow rate of 5 l/min at different pressure heads (left: 100 mmHg; right: 20 mmHg), the impellers were set at different rotating speeds. From the CFD results, a six-blade impeller design was adopted for the development of the BVAD. It was also observed that the fluid can flow smoothly through the pump with minimal shear stress and areas of stagnation, which are related to haemolysis and thrombosis. Flow through the experimental model was calculated for the left and right pumps by matching Reynolds numbers. As it was not possible to include both the left and right chambers in the experimental model, the left and right pumps were tested separately.
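The Reynolds-number matching mentioned above can be made concrete with a short calculation. All fluid properties, dimensions, and speeds below are assumed for illustration; the paper does not list them:

    # Rotational Reynolds number matching between blood and a test fluid:
    # Re = rho * n * D^2 / mu  (n in rev/s, D = impeller diameter)
    rho_blood, mu_blood = 1060.0, 3.5e-3   # kg/m^3, Pa.s (typical blood values)
    rho_test,  mu_test  = 1100.0, 4.2e-3   # hypothetical glycerol-water analog
    D = 0.040                              # impeller diameter in m (assumed)
    n_blood = 3000.0 / 60.0                # assumed design speed, rev/s

    Re_blood = rho_blood * n_blood * D**2 / mu_blood
    n_test = Re_blood * mu_test / (rho_test * D**2)   # speed giving the same Re
    print(f"Re = {Re_blood:.0f}; matched test speed = {n_test * 60:.0f} rpm")

Matching Re in this way lets the experimental model reproduce the dynamic similarity of the blood flow even when a different working fluid is used.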

Relevance: 30.00%

Abstract:

Background: Up-to-date evidence about levels and trends in disease and injury incidence, prevalence, and years lived with disability (YLDs) is an essential input into global, regional, and national health policies. In the Global Burden of Disease Study 2013 (GBD 2013), we estimated these quantities for acute and chronic diseases and injuries for 188 countries between 1990 and 2013.

Methods: Estimates were calculated for disease and injury incidence, prevalence, and YLDs using GBD 2010 methods with some important refinements. Results for incidence of acute disorders and prevalence of chronic disorders are new additions to the analysis. Key improvements include expansion of the cause and sequelae list, updated systematic reviews, use of detailed injury codes, improvements to the Bayesian meta-regression method (DisMod-MR), and use of severity splits for various causes. An index of data representativeness, showing data availability, was calculated for each cause and impairment during three periods globally and at the country level for 2013. In total, 35 620 distinct sources of data were used and documented to calculate estimates for 301 diseases and injuries and 2337 sequelae. The comorbidity simulation provides estimates of the number of sequelae experienced concurrently by individuals, by country, year, age, and sex. Disability weights were updated with the addition of new population-based survey data from four countries.

Findings: Disease and injury were highly prevalent; only a small fraction of individuals had no sequelae. Comorbidity rose substantially with age and in absolute terms from 1990 to 2013. The acute sequelae with the highest incidence were predominantly infectious diseases and short-term injuries, with over 2 billion cases of upper respiratory infections and diarrhoeal disease episodes in 2013; a notable exception was tooth pain due to permanent caries, with more than 200 million incident cases in 2013. Conversely, the leading chronic sequelae were largely attributable to non-communicable diseases, with prevalence estimates for asymptomatic permanent caries and tension-type headache of 2·4 billion and 1·6 billion, respectively. The distribution of the number of sequelae in populations varied widely across regions, with an expected relation between age and disease prevalence. YLDs for both sexes increased from 537·6 million in 1990 to 764·8 million in 2013 owing to population growth and ageing, whereas the age-standardised rate decreased only slightly, from 114·87 per 1000 people to 110·31 per 1000 people. Low back pain and major depressive disorder were among the top ten causes of YLDs in every country. YLD rates per person, by major cause group, indicated that the main drivers of increases were musculoskeletal, mental and substance use, neurological, and chronic respiratory disorders; however, HIV/AIDS was a notable driver of increasing YLDs in sub-Saharan Africa. The proportion of disability-adjusted life years due to YLDs also increased globally, from 21·1% in 1990 to 31·2% in 2013.

Interpretation: Ageing of the world's population is leading to a substantial increase in the number of individuals with sequelae of diseases and injuries. Rates of YLDs are declining much more slowly than mortality rates. The non-fatal dimensions of disease and injury will require increasing attention from health systems. The transition to non-fatal outcomes as the dominant source of burden of disease is occurring rapidly outside sub-Saharan Africa. Our results can guide future health initiatives through examination of epidemiological trends and a better understanding of variation across countries.
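For readers unfamiliar with the metric behind these figures: prevalence-based YLDs are prevalent cases multiplied by a disability weight, summed over sequelae, and age-standardised rates reweight age-specific rates by a standard population. The toy calculation below uses invented numbers for illustration, not GBD estimates:

    import numpy as np

    # Two age groups, one sequela; all values are made up for illustration.
    prevalence = np.array([120_000, 80_000])        # prevalent cases per age group
    disability_weight = 0.07                        # weight of a mild sequela
    population = np.array([1_000_000, 500_000])     # population of each age group
    std_pop_weights = np.array([0.6, 0.4])          # standard population shares

    ylds = prevalence * disability_weight                 # YLDs per age group
    age_specific_rate = ylds / population * 1000          # YLDs per 1000 people
    age_standardised = (age_specific_rate * std_pop_weights).sum()
    print(age_standardised)                               # ~9.5 per 1000

Holding the standard population fixed is what lets the abstract separate the effect of population growth and ageing (which drives the absolute YLD count up) from the nearly flat age-standardised rate.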

Relevance: 30.00%

Abstract:

Considering ultrasound propagation through complex composite media as an array of parallel sonic rays, a comparison of computer-simulated predictions with experimental data has previously been reported for transmission mode (where one transducer serves as transmitter and the other as receiver) in a series of ten acrylic step-wedge samples, immersed in water, exhibiting varying degrees of transit time inhomogeneity. In this study, the same samples were used in pulse-echo mode, where a single ultrasound transducer served as both transmitter and receiver, detecting both ‘primary’ (internal sample interface) and ‘secondary’ (external sample interface) echoes. A transit time spectrum (TTS) was derived, describing the proportion of sonic rays with a particular transit time. A computer simulation was performed to predict the transit time and amplitude of the various echoes created, and the predictions were compared with experimental data. Applying an amplitude-tolerance analysis, 91.7±3.7% of the simulated data were within ±1 standard deviation (STD) of the experimentally measured amplitude-time data. Correlation of predicted and experimental transit time spectra provided coefficients of determination (R²) ranging from 96.8% to 100.0% across the samples tested. These results provide good evidence for the concept of parallel sonic rays. Further, deconvolution of the experimental input and output signals has been shown to be an effective method for identifying echoes otherwise lost to phase cancellation. Potential applications of pulse-echo ultrasound transit time spectroscopy (PE-UTTS) include improving ultrasound image fidelity by improving spatial resolution and reducing phase interference artefacts.
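The deconvolution step referred to above can be sketched as a regularised frequency-domain division of the received signal by the transmitted pulse; the peaks of the result approximate the transit time spectrum. This is a generic Tikhonov-regularised sketch under assumed inputs, not the authors' exact implementation:

    import numpy as np

    def transit_time_spectrum(tx, rx, fs, eps=1e-3):
        """Deconvolve the received signal rx by the transmitted signal tx.
        Returns (delays in s, spectrum); peaks mark ray transit times.
        eps regularises frequencies where tx carries little energy (assumed
        tuning parameter)."""
        n = len(rx)
        TX = np.fft.rfft(tx, n)
        RX = np.fft.rfft(rx, n)
        H = RX * np.conj(TX) / (np.abs(TX) ** 2 + eps * np.max(np.abs(TX)) ** 2)
        return np.arange(n) / fs, np.fft.irfft(H, n)

Because the division is performed per frequency, two echoes that cancel in the time domain can still be separated, which is how deconvolution recovers echoes lost to phase cancellation.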

Relevance: 30.00%

Abstract:

The quality of ultrasound computed tomography imaging is determined primarily by the accuracy of ultrasound transit time measurement. A major problem in the analysis is signal overlap, which makes it difficult to detect the correct transit time. The current standard is to apply a matched-filtering approach to the input and output signals. This study compares the matched-filtering technique with active set deconvolution for deriving a transit time spectrum from a coded-excitation chirp signal and the measured output signal. The ultrasound wave travels along a direct and a reflected path to the receiver, resulting in an overlap in the recorded output signal. The matched-filtering and deconvolution techniques were applied to determine the transit times associated with the two signal paths. Both techniques were able to detect the two different transit times; while matched filtering has better accuracy (0.13 μs vs. 0.18 μs standard deviation), deconvolution has a 3.5-fold better side-lobe to main-lobe ratio. Higher side-lobe suppression is important for further improving image fidelity. These results suggest that a future combination of both techniques could provide improved signal detection and hence improved image fidelity.
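One concrete reading of the two approaches: matched filtering cross-correlates the received signal with the known chirp, while active set deconvolution can be posed as non-negative least squares over a matrix of delayed chirps (scipy's nnls implements the Lawson-Hanson active set algorithm). The sketch below is illustrative and assumes discretised, same-rate signals:

    import numpy as np
    from scipy.optimize import nnls
    from scipy.signal import correlate

    def matched_filter(received, chirp):
        """Cross-correlation with the known chirp; peaks mark transit times."""
        return correlate(received, chirp, mode="full")

    def active_set_deconvolution(received, chirp):
        """Solve received ~= C @ s with s >= 0, where column k of C is the
        chirp delayed by k samples; nnls is a Lawson-Hanson active set
        solver, so s comes out as sparse spikes at the path delays."""
        n = len(received)
        C = np.zeros((n, n))
        for k in range(n):
            m = min(len(chirp), n - k)
            C[k:k + m, k] = chirp[:m]
        s, _ = nnls(C, received)
        return s

The non-negativity constraint is what suppresses the oscillating side lobes that a matched filter inherits from the chirp's autocorrelation, consistent with the side-lobe improvement reported above.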

Relevance: 30.00%

Abstract:

In this paper, we describe our investigation of the cointegration and causal relationships between energy consumption and economic output in Australia over a period of five decades. The framework used is the single-sector aggregate production function; this is the first comprehensive approach in an Australian study of this type to include energy, capital, and labour as separate inputs of production. The empirical evidence points to a cointegrating relationship between energy and output and implies that energy is an important variable in the cointegration space, as are the conventional inputs capital and labour. We also find some evidence of bidirectional causality between GDP and energy use. Although the evidence of causality from energy use to GDP was relatively weak when using the thermal aggregate of energy use, once energy consumption was adjusted for energy quality we found strong evidence of Granger causality from energy use to GDP in Australia over the investigated period. The results are robust irrespective of the assumption of linear trends in the cointegration models, and hold across different econometric approaches.
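The two headline tests in this abstract, cointegration between energy and output and Granger causality from energy use to GDP, are available off the shelf. A simplified bivariate sketch with statsmodels is shown below; the CSV file and column names are hypothetical, and the study itself works in a multivariate system that also includes capital and labour:

    import pandas as pd
    from statsmodels.tsa.stattools import coint, grangercausalitytests

    # Hypothetical annual series of log GDP and quality-adjusted log energy use
    df = pd.read_csv("australia_energy_gdp.csv")

    # Engle-Granger cointegration test between output and energy use
    t_stat, p_value, _ = coint(df["log_gdp"], df["log_energy"])
    print(f"cointegration p-value: {p_value:.3f}")

    # Does lagged energy use help predict GDP? (tests second column -> first)
    grangercausalitytests(df[["log_gdp", "log_energy"]], maxlag=2)

Adjusting the energy series for quality before running the causality test is the step that, per the abstract, turns weak evidence into strong evidence of causality from energy use to GDP.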

Relevance: 30.00%

Abstract:

An application that translates raw thermal melt curve data into more easily assimilated knowledge is described. This program, called ‘Meltdown’, performs a number of data remediation steps before classifying melt curves and estimating melting temperatures. The final output is a report that summarizes the results of a differential scanning fluorimetry (DSF) experiment. Meltdown uses a Bayesian classification scheme, enabling reproducible identification of the various trends commonly found in DSF datasets. The goal of Meltdown is not to replace human analysis of the raw data, but to provide a sensible interpretation of the data that makes this useful experimental technique accessible to naïve users, as well as providing a starting point for detailed analyses by more experienced users.
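As a point of reference for what a melting temperature estimate involves, a common DSF heuristic places Tm at the steepest rise of the fluorescence melt curve, i.e. the maximum of dF/dT. The sketch below uses a synthetic sigmoidal curve and is not Meltdown's own estimator, which combines data remediation with Bayesian curve classification:

    import numpy as np

    def estimate_tm(temps, fluorescence):
        """Estimate the melting temperature as the temperature of the
        steepest fluorescence increase (maximum of dF/dT); a standard DSF
        heuristic, not necessarily Meltdown's exact method."""
        dF_dT = np.gradient(fluorescence, temps)
        return temps[np.argmax(dF_dT)]

    # Hypothetical sigmoidal melt curve with a transition near 55 degrees C
    temps = np.linspace(25, 95, 141)
    curve = 1 / (1 + np.exp(-(temps - 55) / 2.0))
    print(estimate_tm(temps, curve))   # ~55.0

Real DSF traces also show the non-sigmoidal trends the abstract alludes to (aggregation spikes, high initial fluorescence, flat curves), which is why a classification step precedes any Tm estimate.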