Abstract:
This investigation compares two different methodologies for calculating the national cost of epilepsy: the provider-based survey method (PBSM) and the patient-based medical charts and billing method (PBMC&BM). The PBSM uses the National Hospital Discharge Survey (NHDS), the National Hospital Ambulatory Medical Care Survey (NHAMCS) and the National Ambulatory Medical Care Survey (NAMCS) as the sources of utilization. The PBMC&BM uses patient data, charts and billings, to determine utilization rates for specific components of hospital, physician and drug prescriptions.

The 1995 hospital and physician cost of epilepsy is estimated to be $722 million using the PBSM and $1,058 million using the PBMC&BM. The difference of $336 million results from a $136 million difference in utilization and a $200 million difference in unit cost.

Utilization. The utilization difference of $136 million is composed of an inpatient variation of $129 million ($100 million hospital and $29 million physician) and an ambulatory variation of $7 million. The $100 million hospital variance is attributed to the inclusion of febrile seizures in the PBSM (−$79 million) and the exclusion of admissions attributed to epilepsy ($179 million). The former suggests that the diagnostic codes used in the NHDS may not properly match the current definition of epilepsy as used in the PBMC&BM. The latter suggests NHDS errors in the attribution of an admission to the principal diagnosis.

The $29 million variance in inpatient physician utilization is the result of different per-day-of-care physician visit rates: 1.3 for the PBMC&BM versus 1.0 for the PBSM. The absence of visit frequency measures in the NHDS affects the internal validity of the PBSM estimate and requires the investigator to make conservative assumptions.

The remaining ambulatory resource utilization variance is $7 million. Of this amount, $22 million is the result of an underestimate of ancillaries in the NHAMCS and NAMCS extrapolations using the patient visit weight.

Unit cost. The resource cost variation is $200 million: $22 million inpatient and $178 million ambulatory. The inpatient variation of $22 million is composed of $19 million in hospital per-day rates, due to a higher cost per day in the PBMC&BM, and $3 million in physician visit rates, due to a higher cost per visit in the PBMC&BM.

The ambulatory cost variance is $178 million, composed of higher per-physician-visit costs of $97 million and higher per-ancillary costs of $81 million. Both are attributed to the PBMC&BM's precise identification of resource utilization, which permits accurate valuation.

Conclusion. Both methods have specific limitations. The PBSM's strengths are its sample designs, which lead to nationally representative estimates and permit statistical point and confidence interval estimation for the nation for certain variables under investigation. However, the findings of this investigation suggest that the internal validity of the derived estimates is questionable and that important additional information required to precisely estimate the cost of an illness is absent.

The PBMC&BM is a superior method for identifying the resources utilized in the physician encounter with the patient, permitting more accurate valuation. However, the PBMC&BM does not have the statistical reliability of the PBSM; it relies on synthesized national prevalence estimates to extrapolate a national cost estimate. While precision is important, the ability to generalize to the nation may be limited due to the small number of patients that are followed.
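The decomposition of the $336 million gap can be reconciled arithmetically from the figures quoted above. A minimal sketch in Python (the $7 million ambulatory utilization figure is reported alongside a $22 million ancillary underestimate, so the remaining offset is implied rather than stated):

```python
# Reconciliation of the PBSM vs PBMC&BM cost gap using the figures quoted
# in the abstract (all values in millions of 1995 US dollars).

pbsm_total = 722
pbmcbm_total = 1_058

total_gap = pbmcbm_total - pbsm_total          # 336
assert total_gap == 336

# Utilization side of the gap
inpatient_hospital = -79 + 179   # febrile seizures + excluded admissions = 100
inpatient_physician = 29         # visit rate 1.3 vs 1.0 per day of care
inpatient_utilization = inpatient_hospital + inpatient_physician   # 129
ambulatory_utilization = 7       # net of a +22 ancillary underestimate and an implied offset
utilization_gap = inpatient_utilization + ambulatory_utilization
assert utilization_gap == 136

# Unit-cost side of the gap
inpatient_cost = 19 + 3          # hospital per-day rates + physician visit rates = 22
ambulatory_cost = 97 + 81        # per-physician-visit + per-ancillary costs = 178
unit_cost_gap = inpatient_cost + ambulatory_cost
assert unit_cost_gap == 200

assert utilization_gap + unit_cost_gap == total_gap   # 136 + 200 = 336
print(f"Total gap: ${total_gap}M = ${utilization_gap}M utilization + ${unit_cost_gap}M unit cost")
```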
Abstract:
The purpose of this study was to examine factors that may be associated with benzodiazepine (BZ) self-administration and risks of dependence in anxious patients. Preliminary work included examination of psychosocial characteristics and subjective drug response as potential predictors of medication use. Fifty-five male and female patients with generalized anxiety or panic disorder participated in a 3-week outpatient Choice Procedure in which they self-medicated "as needed" with alprazolam (Alz) and placebo. Findings showed that a large amount of variance in alprazolam preference, frequency, and quantity of use could be predicted by measures of anxiety, drug liking, and certain personality characteristics. The primary study extended this work by examining whether individual differences in Alz sensitivity also predict patterns of use. Twenty anxious patients participated in the study, which required 11 weekly clinic visits. Ten of these also participated in a baseline assessment of HPA-axis function that involved 24-hour monitoring of cortisol and ACTH levels and a CRH Stimulation Test. This assessment was conducted on the basis of prior evidence that steroid metabolites exert neuromodulatory effects on the GABA-A receptor and that HPA-axis function may be related to BZ sensitivity and long-term disability in anxious patients. Patients were classified as either HIGH or LOW users based on their p.r.n. patterns of Alz use during the first 3 weeks of the study. They then participated in a 4-week dose-response trial in which they received prescribed doses of medication (placebo, 0.25, 0.5, and 1.0 mg Alz), each taken TID for 1 week. The dose-response trial was followed by a second 3-week Choice Procedure. Findings were not indicative of biological differences in Alz sensitivity between the HIGH and LOW users. However, the HIGH users had higher baseline anxiety and greater anxiolytic response to Alz than the LOW users. Anxiolytic benefits of p.r.n. and prescribed dosing were shown to be comparable, and patients' conservative patterns of p.r.n. medication use were not affected by the period of prescribed dosing. Although there was not strong evidence to suggest relationships between HPA-axis function and Alz use or sensitivity, interesting findings emerged about the relationship between HPA-axis function and anxiety.
Abstract:
Much progress has been made in recent years in field data assimilation, remote sensing and ecosystem modeling, yet our global view of phytoplankton biogeography beyond chlorophyll biomass is still a cursory taxonomic picture, with vast areas of the open ocean requiring field validation. High performance liquid chromatography (HPLC) pigment data combined with inverse methods offer an advantage over many other phytoplankton quantification measures by providing an immediate perspective of the whole phytoplankton community in a sample as a function of chlorophyll biomass. Historically, such chemotaxonomic analysis has been conducted mainly at local spatial and temporal scales in the ocean. Here, we apply a widely tested inverse approach, CHEMTAX, to a global climatology of pigment observations from HPLC. This study marks the first systematic and objective global application of CHEMTAX, yielding a seasonal climatology comprising ~1500 1° × 1° global grid points of the major phytoplankton pigment types in the ocean, characterizing cyanobacteria, haptophytes, chlorophytes, cryptophytes, dinoflagellates, and diatoms, with results validated against prior regional studies where possible. Key findings from this new global view of specific phytoplankton abundances from pigments are a) the large global proportion of marine haptophytes (comprising 32 ± 5% of total chlorophyll), whose biogeochemical functional roles are relatively unknown, and b) the contrasting spatial scales of complexity in global community structure that can be explained in part by regional oceanographic conditions. These publicly accessible results will guide future parameterizations of marine ecosystem models exploring the link between phytoplankton community structure and marine biogeochemical cycles.
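The core of a CHEMTAX-style inversion is a constrained least-squares fit of class-specific pigment:chlorophyll ratios to observed pigment concentrations. A minimal sketch of that inner step is shown below (not the full CHEMTAX algorithm, which also iteratively adjusts the ratio matrix); the pigment names and ratio values are illustrative placeholders, not the values used in the study:

```python
import numpy as np
from scipy.optimize import nnls

# Illustrative ratio matrix R (classes x pigments): each row gives the assumed
# ratio of a diagnostic pigment to total chlorophyll-a for one class.
classes = ["diatoms", "haptophytes", "cyanobacteria"]
pigments = ["fucoxanthin", "19'-hexanoyloxyfucoxanthin", "zeaxanthin", "chl_a"]
R = np.array([
    [0.75, 0.00, 0.00, 1.0],   # diatoms
    [0.10, 0.55, 0.00, 1.0],   # haptophytes
    [0.00, 0.00, 0.35, 1.0],   # cyanobacteria
])

def unmix_sample(pigment_obs):
    """Estimate non-negative class contributions c (in chl-a units) such that
    c @ R approximates the observed pigment vector."""
    c, _residual = nnls(R.T, pigment_obs)   # solve R.T @ c ~= obs, with c >= 0
    return dict(zip(classes, c))

# Example: one HPLC sample, same pigment order as `pigments` (mg m^-3, made up)
obs = np.array([0.42, 0.22, 0.07, 0.80])
contributions = unmix_sample(obs)
total = sum(contributions.values())
for name, chl in contributions.items():
    print(f"{name:13s} {chl:.3f} mg chl-a m^-3  ({100 * chl / total:.0f}%)")
```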
Abstract:
Background: Postpartum hemorrhage (PPH) remains a major killer of women worldwide. Standard uterotonic treatments used to control postpartum bleeding do not always work and are not always available. Misoprostol's potential as a treatment option for PPH is increasingly recognized, but its use remains ad hoc and the available evidence does not support the safety or efficacy of one particular regimen. This study aimed to determine the adjunct benefit of misoprostol when combined with standard oxytocics for PPH treatment. Methods: A randomized controlled trial was conducted in four Karachi hospitals from December 2005 to April 2007 to assess the benefit of a 600 mcg dose of misoprostol given sublingually in addition to standard oxytocics for postpartum hemorrhage treatment. Consenting women had their blood loss measured after normal vaginal delivery and were enrolled in the study after losing more than 500 ml of blood. Women were randomly assigned to receive either 600 mcg sublingual misoprostol or matching placebo in addition to standard PPH treatment with injectable oxytocics. Both women and providers were blinded to the treatment assignment. Blood loss was collected until active bleeding stopped and for a minimum of one hour after PPH diagnosis. Total blood loss, hemoglobin measures, and treatment outcomes were recorded for all participants. Results: Due to a much lower rate of PPH than expected (1.2%), only sixty-one patients were diagnosed and treated for PPH in this study, and we were therefore unable to measure statistical significance in any of the primary endpoints. The addition of 600 mcg sublingual misoprostol to standard PPH treatments does, however, suggest a trend toward reduced postpartum blood loss, a smaller drop in postpartum hemoglobin, and a need for fewer additional interventions. Women who bled less overall had a significantly smaller drop in hemoglobin and received fewer additional interventions. There were no hysterectomies or maternal deaths among study participants. The rate of transient shivering and fever was significantly higher among women receiving misoprostol. Conclusion: A 600 mcg dose of misoprostol given sublingually shows promise as an adjunct treatment for PPH, and its use should continue to be explored for its life-saving potential in the care of women experiencing PPH. Trial Registration: ClinicalTrials.gov, Registry No. NCT00116480
Abstract:
"Import content of exports", based on Leontief's demand-driven input-output model, has been widely used as an indicator to measure a country's degree of participation in vertical specialisation trade. At the sectoral level, this indicator represents the share of intermediates imported by all sectors that is embodied in a given sector's exported output. However, this indicator only reflects one aspect of vertical specialisation: the demand side. This paper discusses the possibility of using the input-output model developed by Ghosh to measure vertical specialisation from the perspective of the supply side. At the sectoral level, the Ghosh-type indicator measures the share of imported intermediates used in a sector's production that is subsequently embodied in exports by all sectors. We estimate these two indicators of vertical specialisation for 47 selected economies for 1995, 2000 and 2005 using the OECD's harmonized input-output database. In addition, the potential biases of both indicators due to the treatment of net withdrawals from inventories are also discussed.
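To make the two indicators concrete, a minimal sketch with a toy two-sector input-output table is shown below. It computes the demand-side "import content of exports" from the Leontief inverse and a supply-side analogue from the Ghosh (allocation-coefficient) inverse; the numbers are illustrative and the formulation is a stylized reading of the description above, not the paper's exact treatment (for instance, of inventories):

```python
import numpy as np

# Toy 2-sector economy (values in billions; purely illustrative).
Z_d = np.array([[20., 30.],     # domestic intermediate flows (row = selling sector)
                [40., 10.]])
Z_m = np.array([[10., 25.],     # imported intermediates used by each sector (columns)
                [ 5., 15.]])
e = np.array([60., 80.])        # exports by sector
x = np.array([200., 180.])      # gross output by sector

A_d = Z_d / x                   # domestic input coefficients (column j divided by x_j)
A_m = Z_m / x                   # import input coefficients
B_d = (Z_d.T / x).T             # domestic allocation (Ghosh) coefficients (row i / x_i)

I = np.eye(2)
L = np.linalg.inv(I - A_d)      # Leontief inverse
G = np.linalg.inv(I - B_d)      # Ghosh inverse (domestic)

# Demand side: imported intermediates embodied per unit of each sector's exports.
import_content_of_exports = A_m.sum(axis=0) @ L
# Supply side: share of each sector's output (and hence of the imported
# intermediates it uses) ultimately embodied in exports by all sectors.
ghosh_export_share = G @ (e / x)

print("Leontief-type (demand side):", np.round(import_content_of_exports, 3))
print("Ghosh-type    (supply side):", np.round(ghosh_export_share, 3))

# The two views allocate the same aggregate value of embodied imports in exports.
demand_total = import_content_of_exports @ e
supply_total = (A_m.sum(axis=0) * x) @ ghosh_export_share
assert np.isclose(demand_total, supply_total)
```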
Abstract:
In the post-Asian crisis period, bank loans to the manufacturing sector have shown a slow recovery in the affected countries, and the Philippines is no exception. This paper provides a literature survey on the effectiveness of the central bank's monetary policy and the responsiveness of the financial market, and discusses the future work necessary to better understand the effectiveness of monetary policy in the Philippines. As the survey shows, most previous works focus on the correlation with short-term policy rates during periods of monetary tightening and pay relatively little attention to quantitative effectiveness. Future work should shed light on (1) the asset side of banks, beyond loans outstanding, to analyze their behavior and preferences in structuring portfolios, and (2) the quantitative impacts during periods of monetary easing.
Abstract:
This paper explores the potential usefulness of an AGE model with a Melitz-type trade specification for assessing the economic effects of technical regulations, taking the EU ELV/RoHS directives as an example. Simulation experiments reveal that: (1) raising the fixed cost of exporting to the EU market leads exports of the targeted commodities (motor vehicles and parts for ELV, and electronic equipment for RoHS) to the EU from outside regions/countries to expand while intra-EU trade shrinks, provided the importer's preference for variety (PfV) is not strong; (2) if the PfV is not strong, policy changes that reduce the number of firms enable surviving producers with high productivity to expand into large-scale mass production and fully enjoy economies of scale; and (3) as the strength of the importer's PfV is varied from zero to unity, there is a threshold value at which the simulation results and their interpretation change completely.
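The role of the preference-for-variety parameter can be illustrated with a symmetric CES aggregator in which the love-of-variety effect is scaled separately from the substitution elasticity, a common generalization. This is a hedged illustration of the mechanism only, not the paper's AGE model: with N identical varieties each consumed in quantity q, the aggregate can be written Q = (N·q)·N^(v/(σ−1)), where v = 0 switches the variety effect off and v = 1 recovers the standard Dixit-Stiglitz case.

```python
# Illustrative only: how the strength of preference for variety (v in [0, 1])
# changes the aggregate gain from having more firms/varieties, holding total
# physical quantity N*q fixed. Not the paper's AGE model.

def ces_aggregate(n_varieties: int, q_per_variety: float,
                  sigma: float = 4.0, v: float = 1.0) -> float:
    """Symmetric CES aggregate with a scalable love-of-variety term.

    Q = (N * q) * N ** (v / (sigma - 1))
    v = 0: no gain from variety; v = 1: standard Dixit-Stiglitz CES.
    """
    total_quantity = n_varieties * q_per_variety
    variety_gain = n_varieties ** (v / (sigma - 1.0))
    return total_quantity * variety_gain

total = 100.0  # fixed total quantity, split evenly across N varieties
for v in (0.0, 0.5, 1.0):
    row = [round(ces_aggregate(n, total / n, sigma=4.0, v=v), 1) for n in (1, 10, 100)]
    print(f"v = {v:>3}: Q for N = 1, 10, 100 -> {row}")

# With v = 0 the number of firms is irrelevant for the aggregate, so policies
# that concentrate output in fewer, larger firms look favourable; with v = 1
# the loss of varieties itself reduces the aggregate.
```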
Abstract:
The construction industry, one of the most important industries in the development of a country, generates unavoidable impacts on the environment. There is a strong and widespread social demand for greater respect for the environment, so the construction industry needs to reduce the impact it produces. Proper waste management is not enough; a further step in environmental management must be taken, introducing new measures for prevention at source, such as good practices that promote recycling. Following the amendment of the legal framework applicable to construction and demolition waste (C&D waste), important developments have been incorporated into European and international laws aiming to promote a culture of reuse and recycling. This change of mindset, which is progressively taking place in society, allows C&D waste to be considered no longer as unusable waste but as a reusable material. The main objective of the work presented in this paper is to enhance C&D waste management systems through the development of preventive measures during the construction process. These measures concern all the agents intervening in the construction process, as only the personal commitment of all of them can ensure efficient management of the C&D waste generated. Finally, a model based on preventive measures achieves organizational cohesion between the different stages of the construction process and promotes the conservation of raw materials through reuse and waste minimization, all with the aim of achieving a C&D waste management system whose primary goal is zero waste generation.
Abstract:
ICTs nowadays account for 2% of total carbon emissions. However, at a time when strict measures to reduce energy consumption in all industrial and services sectors are required, the ICT sector faces an increase in demand for services and bandwidth. The deployment of Next Generation Networks (NGN) will be the answer to this new demand and, specifically, Next Generation Access Networks (NGANs) will provide higher-bandwidth access to users. Several policy and cost analyses are being carried out to understand the risks and opportunities of new deployments, though the question of the role of energy consumption in NGANs seems off the table. Thus, this paper proposes a model to analyze the energy consumption of the main fiber-based NGAN architectures, i.e. Fiber To The House (FTTH), in both Passive Optical Network (PON) and Point-to-Point (PtP) variations, and FTTx/VDSL. The aim of this analysis is to provide deeper insight into the impact of new deployments on the energy consumption of the ICT sector and the effects of energy consumption on the life-cycle cost of NGANs. The paper also presents an energy consumption comparison of the presented architectures, particularized to the specific geographic and demographic distribution of users in Spain but easily extendable to other countries.
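A per-subscriber energy model for these access architectures can be sketched as below. The power figures, split ratios and electricity price are illustrative assumptions for the sketch only (they are not taken from the paper), but the structure — shared central-office equipment amortized over the subscribers it serves, plus dedicated customer-premises equipment — is the typical shape of such models:

```python
# Hedged sketch of a per-subscriber power model for fiber-based access networks.
# All power values, split ratios and prices are illustrative assumptions.

from dataclasses import dataclass

@dataclass
class AccessArchitecture:
    name: str
    co_power_w: float          # central-office power per OLT port / line card (W)
    subs_per_co_port: int      # subscribers sharing that port (e.g. PON split ratio)
    cpe_power_w: float         # customer-premises equipment power per subscriber (W)
    remote_node_w: float = 0.0 # powered street cabinet (e.g. VDSL DSLAM), per subscriber

    def watts_per_subscriber(self, take_rate: float = 1.0) -> float:
        """Shared equipment is amortized only over active subscribers (take rate)."""
        active = max(1, int(self.subs_per_co_port * take_rate))
        return self.co_power_w / active + self.cpe_power_w + self.remote_node_w

ARCHITECTURES = [
    AccessArchitecture("FTTH GPON", co_power_w=20.0, subs_per_co_port=64, cpe_power_w=5.0),
    AccessArchitecture("FTTH PtP",  co_power_w=2.5,  subs_per_co_port=1,  cpe_power_w=5.0),
    AccessArchitecture("FTTx/VDSL", co_power_w=2.0,  subs_per_co_port=1,  cpe_power_w=7.0,
                       remote_node_w=1.5),
]

KWH_PRICE_EUR = 0.20           # assumed electricity price, for a rough yearly cost
HOURS_PER_YEAR = 24 * 365

for arch in ARCHITECTURES:
    w = arch.watts_per_subscriber(take_rate=0.5)
    kwh_year = w * HOURS_PER_YEAR / 1000.0
    print(f"{arch.name:10s}: {w:5.2f} W/subscriber, "
          f"~{kwh_year:5.1f} kWh/year, ~{kwh_year * KWH_PRICE_EUR:5.2f} EUR/year")
```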
Abstract:
Improving the security of mobile phones is one of the crucial requirements for protecting the personal information they hold and the operations that can be performed from them. This article presents an authentication procedure that verifies people's identity by having them make a signature in the air while holding the mobile phone. Different temporal distance algorithms have been proposed and evaluated on a database of 50 people making their signatures in the air and 6 people trying to forge each of them after studying their recordings. Approaches based on DTW obtained better EER results than those based on LCS (2.80% against 3.34%). In addition, different signal normalization methods were evaluated, and none achieved better EER results than when no normalization was carried out.
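A minimal sketch of the kind of temporal distance used here, dynamic time warping (DTW) over accelerometer sequences, is shown below. This is a generic textbook DTW, not the authors' specific variant; the three-axis signals and the decision threshold are illustrative:

```python
import math
from typing import List, Sequence, Tuple

Sample = Tuple[float, float, float]   # one accelerometer reading (x, y, z)

def dtw_distance(a: Sequence[Sample], b: Sequence[Sample]) -> float:
    """Classic O(len(a)*len(b)) dynamic time warping over 3-axis signals."""
    n, m = len(a), len(b)
    INF = float("inf")
    cost = [[INF] * (m + 1) for _ in range(n + 1)]
    cost[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            d = math.dist(a[i - 1], b[j - 1])          # Euclidean distance between samples
            cost[i][j] = d + min(cost[i - 1][j],       # insertion
                                 cost[i][j - 1],       # deletion
                                 cost[i - 1][j - 1])   # match
    return cost[n][m] / (n + m)                        # length-normalized warping cost

def accept(candidate: Sequence[Sample],
           enrolled: List[Sequence[Sample]],
           threshold: float = 0.5) -> bool:
    """Accept if the candidate gesture is close enough to any enrolled template.
    The threshold is illustrative; in practice it would be tuned to the EER point."""
    return min(dtw_distance(candidate, t) for t in enrolled) < threshold

# Tiny synthetic example: a genuine attempt close to the template, a forgery further away.
template = [(0.0, 0.1, 9.8), (0.5, 0.2, 9.6), (1.0, 0.0, 9.5), (0.4, -0.1, 9.7)]
genuine  = [(0.1, 0.1, 9.8), (0.6, 0.2, 9.6), (0.9, 0.1, 9.5), (0.3, -0.1, 9.7)]
forgery  = [(2.0, 1.5, 8.0), (1.8, 1.2, 8.5), (2.2, 1.0, 8.2), (1.9, 1.4, 8.1)]
print("genuine accepted:", accept(genuine, [template]))
print("forgery accepted:", accept(forgery, [template]))
```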
Abstract:
European cities are essential to the development of Europe, as they constitute the living environment of more than 60% of the population of the European Union and are drivers of the European economy: just under 85% of the EU's gross domestic product is produced in urban areas (EC, 2007a). The car was one of the main factors of development during the 20th century, but it is at the same time the origin of the key problem cities have to face: traffic growth. This has resulted in chronic congestion with many adverse consequences, such as air pollution and noise. This loss of environmental quality is one of the reasons for the urban sprawl experienced by European cities in recent decades, and urban sprawl in turn worsens environmental conditions. We must return to the dense city, but one that is clean and competitive, and this implies reducing car use while providing quality transport alternatives sufficient to recover and maintain the competitiveness of cities (EC, 2007a). Consequently, European cities need to establish an urban transport strategy that helps reduce their environmental problems — mainly emissions and noise — without decreasing their trip attraction. This aspect is very important because a loss of trip attraction would result in more people moving to more dispersed areas, further worsening the current situation. This thesis attempts to contribute solutions to this problem in two ways: 1) The first is to analyze the complementarity and possible synergies of several urban transport measures aimed at shifting the modal split toward more sustainable means of transport. This analysis focuses on the three aspects already mentioned: emissions, noise, and attractiveness or competitiveness. 2) Once possible synergies and complementarities have been analyzed, the second objective is to propose the best combination of these measures, in terms of level of implementation, to achieve the maximum benefit with respect to the three aspects previously established: emissions, noise, and attractiveness or competitiveness. Therefore, within the wide range of measures that enhance sustainable urban transport, three have been selected in this thesis to establish a methodology for achieving these objectives. The analysis is based on the region of Madrid, which is also the case study selected for this research.
Abstract:
Light detection and ranging (LiDAR) technology is beginning to have an impact on agriculture. Canopy volume and/or fruit tree leaf area can be estimated using terrestrial laser sensors based on this technology. However, these devices can be configured in different ways depending on the resolution and scanning mode. As a consequence, data accuracy and LiDAR-derived parameters are affected by the sensor configuration, and may vary according to the vegetative characteristics of tree crops. Given this scenario, users and suppliers of these devices need to know how to use the sensor in each case. This paper presents a computer program to determine the best configuration, allowing simulation and evaluation of different LiDAR configurations in various tree structures (or training systems). The ultimate goal is to optimise the use of laser scanners in field operations. The software presented generates a virtual orchard and then allows the scanning simulation with a laser sensor. Trees are created using a hidden Markov tree (HMT) model. Varying the foliar structure of the orchard, the LiDAR simulation was applied to twenty different artificially created orchards, with and without leaves, from two positions (lateral and zenith). To validate the laser sensor configuration, the leaf surface of the simulated trees was compared with the parameters obtained from LiDAR measurements: the impacted leaf area, the impacted total area (leaves and wood), and the impacted area in the three outer layers of leaves.
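The kind of scanning simulation described above can be sketched as a simple ray-casting exercise: place spherical "leaves" in a virtual crown, sweep laser beams from a scanner position over an angular grid, and count which beams are intercepted. The geometry below is deliberately minimal and purely illustrative; the actual software builds trees with a hidden Markov tree model and distinguishes leaf and wood impacts:

```python
import math
import random

random.seed(42)

# Virtual crown: leaves modelled as small spheres inside a 2 x 2 x 3 m volume.
LEAF_RADIUS = 0.05  # m, illustrative
leaves = [(random.uniform(-1, 1), random.uniform(4, 6), random.uniform(1, 4))
          for _ in range(500)]

def ray_hits_sphere(origin, direction, center, radius):
    """True if the ray origin + t*direction (t >= 0, unit direction) hits the sphere."""
    ox, oy, oz = (origin[i] - center[i] for i in range(3))
    b = 2 * (ox * direction[0] + oy * direction[1] + oz * direction[2])
    c = ox * ox + oy * oy + oz * oz - radius * radius
    disc = b * b - 4 * c
    return disc >= 0 and (-b + math.sqrt(disc)) >= 0

def scan(origin, h_steps=60, v_steps=30):
    """Sweep beams over an angular window and count beams intercepted by any leaf."""
    hits = 0
    for i in range(h_steps):
        for j in range(v_steps):
            az = math.radians(-30 + 60 * i / (h_steps - 1))   # horizontal sweep
            el = math.radians(45 * j / (v_steps - 1))         # vertical sweep
            d = (math.sin(az) * math.cos(el), math.cos(az) * math.cos(el), math.sin(el))
            if any(ray_hits_sphere(origin, d, leaf, LEAF_RADIUS) for leaf in leaves):
                hits += 1
    return hits, h_steps * v_steps

# Lateral scan position: 4-6 m in front of the crown, 1.5 m above the ground.
hits, beams = scan(origin=(0.0, 0.0, 1.5))
print(f"{hits}/{beams} beams intercepted ({100 * hits / beams:.1f}%) at this resolution")
```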
Abstract:
Recommender systems play an important role in reducing the negative impact of information overload on websites where users can vote for their preferences on items. The most common technique for providing recommendations is collaborative filtering, in which it is essential to discover the users most similar to the one to whom you wish to make recommendations. The hypothesis of this paper is that the results obtained by applying traditional similarity measures can be improved by taking contextual information, drawn from the entire body of users, and using it to calculate the singularity that exists, for each item, in the votes cast by each pair of users being compared. The greater the singularity of the votes shared by two given users, the greater their impact on the similarity. The results, tested on the MovieLens, Netflix and FilmAffinity databases, corroborate the excellent behaviour of the proposed singularity measure.
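One plausible shape for such a singularity weighting is sketched below: the less common a particular vote is among all users of an item, the more an agreement on that vote contributes to the similarity between two users. This is an illustration of the idea only, not the exact measure defined in the paper:

```python
from collections import defaultdict
from typing import Dict

Ratings = Dict[str, Dict[str, int]]   # user -> {item: vote}

def singularity_weighted_similarity(ratings: Ratings, u: str, v: str) -> float:
    """Illustrative singularity-weighted similarity (not the paper's exact formula).

    For every item both users voted on, an agreement is weighted by how *rare*
    that vote is across the whole user base: singular agreements count more.
    """
    # Fraction of voters of each item that cast each vote value.
    vote_share = defaultdict(lambda: defaultdict(float))
    for user_votes in ratings.values():
        for item, vote in user_votes.items():
            vote_share[item][vote] += 1
    for item, counts in vote_share.items():
        total = sum(counts.values())
        for vote in counts:
            counts[vote] /= total

    common = set(ratings[u]) & set(ratings[v])
    if not common:
        return 0.0
    score = 0.0
    for item in common:
        if ratings[u][item] == ratings[v][item]:
            # Singularity of the shared vote: 1 - share of users casting it.
            score += 1.0 - vote_share[item][ratings[u][item]]
    return score / len(common)

ratings: Ratings = {
    "alice": {"matrix": 5, "titanic": 2, "up": 4},
    "bob":   {"matrix": 5, "titanic": 2, "up": 1},
    "carol": {"matrix": 5, "titanic": 5, "up": 4},
    "dave":  {"matrix": 5, "titanic": 5, "up": 4},
    "eve":   {"matrix": 5, "titanic": 5, "up": 4},
}
print(singularity_weighted_similarity(ratings, "alice", "bob"))    # 0.2: one rare agreement (titanic=2)
print(singularity_weighted_similarity(ratings, "carol", "dave"))   # 0.2: two commonplace agreements
```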
Abstract:
Collaborative filtering recommender systems contribute to alleviating the problem of information overload that exists on the Internet as a result of the mass use of Web 2.0 applications. The use of an adequate similarity measure becomes a determining factor in the quality of the prediction and recommendation results of the recommender system, as well as in its performance. In this paper, we present a memory-based collaborative filtering similarity measure that provides extremely high-quality and balanced results; these results are complemented by a low processing time (high performance), similar to that required by traditional similarity metrics. The experiments have been carried out on the MovieLens and Netflix databases, using a representative set of information retrieval quality measures.
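Evaluating such a similarity measure typically combines a prediction-accuracy metric with information-retrieval style quality measures over the recommended top-N list. A minimal, hedged sketch of that evaluation step (generic MAE and precision/recall on held-out ratings; not the paper's exact experimental protocol):

```python
from typing import Dict, List, Tuple

def mae(predictions: Dict[Tuple[str, str], float],
        test_ratings: Dict[Tuple[str, str], float]) -> float:
    """Mean absolute error over the (user, item) pairs held out for testing."""
    pairs = [k for k in test_ratings if k in predictions]
    return sum(abs(predictions[k] - test_ratings[k]) for k in pairs) / len(pairs)

def precision_recall_at_n(recommended: List[str], relevant: List[str], n: int) -> Tuple[float, float]:
    """Precision and recall of the top-N recommended items against the relevant set."""
    top_n = recommended[:n]
    hits = len(set(top_n) & set(relevant))
    return hits / n, hits / len(relevant) if relevant else 0.0

# Toy held-out data for one user (illustrative values only).
test_ratings = {("u1", "matrix"): 5.0, ("u1", "titanic"): 2.0, ("u1", "up"): 4.0}
predictions  = {("u1", "matrix"): 4.5, ("u1", "titanic"): 2.5, ("u1", "up"): 3.5}
print("MAE:", mae(predictions, test_ratings))                       # 0.5

recommended = ["matrix", "up", "titanic", "avatar"]                 # ranked by predicted rating
relevant = ["matrix", "up"]                                         # items the user actually liked
p, r = precision_recall_at_n(recommended, relevant, n=2)
print(f"precision@2 = {p:.2f}, recall@2 = {r:.2f}")
```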
Abstract:
Presentation given at the PhD Seminar of ITS 2011 in Budapest. ICTs (Information and Communication Technologies) currently account for 2% of total carbon emissions. However, although modern standards require strict measures to reduce energy consumption across all industrial and services sectors, the ICT sector also faces an increase in demand for services and bandwidth. The deployment of Next Generation Networks (NGN) will be the answer to this new demand; more specifically, Next Generation Access Networks (NGANs) will provide higher-bandwidth access to users. Several policy and cost analyses are being carried out to understand the risks and opportunities of new deployments, but the question of what role energy consumption plays in NGANs seems off the table. Thus, this paper proposes a model to analyse the energy consumption of the main fibre-based NGAN architectures: Fibre To The House (FTTH), in both Passive Optical Network (PON) and Point-to-Point (PtP) variations, and FTTx/VDSL. The aim of this analysis is to provide deeper insight into the impact of new deployments on the energy consumption of the ICT sector and the effects of energy consumption on the life-cycle cost of NGANs. The paper also presents an energy consumption comparison of the presented architectures, particularised to the specific geographic and demographic distribution of users in Spain but easily extendable to other countries.