793 results for PHARMACY-BASED MEASURES
Abstract:
European cities are essential to the development of Europe: they constitute the living environment of more than 60% of the population of the European Union and are the drivers of the European economy, with just under 85% of the EU's gross domestic product produced in urban areas (EC, 2007a). The car was one of the main factors of urban development during the 20th century, but it has also become the origin of the key problem cities now have to face: increasing traffic. This has resulted in chronic congestion with many adverse consequences, notably air pollution and noise. This loss of environmental quality is one of the reasons for the urban sprawl experienced by European cities in recent decades, and that sprawl in turn worsens environmental conditions. We must return to the dense city, but one that is clean and competitive, which implies reducing car use while providing transport alternatives of sufficient quality to recover and maintain the competitiveness of cities (EC, 2007a). Consequently, European cities need to establish an urban transport strategy that helps reduce their environmental problems (mainly emissions and noise) without decreasing their trip attraction. This aspect is very important because a loss of trip attraction would result in more people moving to more dispersed areas, further worsening the current situation. This thesis attempts to contribute solutions to this problem in two ways: 1) The first is to analyze the complementarity and possible synergies of several urban transport measures aimed at shifting the modal split towards more sustainable means of transport. This analysis focuses on the three aspects already mentioned: emissions, noise, and attractiveness or competitiveness. 2) Once the possible synergies and complementarities have been analyzed, the second objective is to propose the best combination of these measures, in terms of level of implementation, to achieve the maximum benefit with respect to the three aspects previously established. Therefore, within the wide range of measures enhancing sustainable urban transport, three have been selected in this thesis to establish a methodology for achieving these objectives. The analysis is based on the region of Madrid, which is also the case study selected for this research.
Abstract:
Light detection and ranging (LiDAR) technology is beginning to have an impact on agriculture. Canopy volume and/or fruit tree leaf area can be estimated using terrestrial laser sensors based on this technology. However, these devices can be used with different settings of resolution and scanning mode. As a consequence, data accuracy and LiDAR-derived parameters are affected by the sensor configuration and may vary according to the vegetative characteristics of the tree crop. Given this scenario, users and suppliers of these devices need to know how to use the sensor in each case. This paper presents a computer program to determine the best configuration, allowing the simulation and evaluation of different LiDAR configurations in various tree structures (or training systems). The ultimate goal is to optimise the use of laser scanners in field operations. The software presented generates a virtual orchard and then simulates its scanning with a laser sensor. Trees are created using a hidden Markov tree (HMT) model. By varying the foliar structure of the orchard, the LiDAR simulation was applied to twenty different artificially created orchards, with and without leaves, from two positions (lateral and zenithal). To validate the laser sensor configuration, the leaf surface of the simulated trees was compared with the parameters obtained from the LiDAR measurements: the impacted leaf area, the impacted total area (leaves and wood), and the impacted area in the three outer layers of leaves.
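To make the kind of validation described above concrete (comparing the real leaf area of a simulated tree with the leaf area actually impacted by the scanner), the following minimal Python sketch scans a toy orchard of randomly placed square leaves with parallel lateral beams and reports the fraction of leaf area impacted. The leaf geometry, beam spacing and single scan direction are simplifying assumptions for illustration only; they do not reproduce the HMT-based tree generator or the sensor model of the paper.

    import numpy as np

    rng = np.random.default_rng(0)

    # Toy orchard: N square leaves of side s (m), parallel to the Y-Z plane,
    # scattered at random positions inside a 1 m x 1 m x 1 m crown (hypothetical).
    N, s = 500, 0.05
    centers = rng.uniform(0.0, 1.0, size=(N, 3))     # (x, y, z) leaf centres
    total_leaf_area = N * s * s

    # "Lateral" scan: parallel beams travelling along +X on a regular (y, z) grid.
    step = 0.01                                      # beam spacing (resolution proxy)
    ys, zs = np.meshgrid(np.arange(0, 1, step), np.arange(0, 1, step))
    impacted = set()
    for y, z in zip(ys.ravel(), zs.ravel()):
        # Candidate leaves whose square footprint contains this beam.
        hits = np.where((np.abs(centers[:, 1] - y) <= s / 2) &
                        (np.abs(centers[:, 2] - z) <= s / 2))[0]
        if hits.size:
            impacted.add(hits[np.argmin(centers[hits, 0])])   # beam stops at nearest leaf

    impacted_area = len(impacted) * s * s
    print(f"total leaf area   : {total_leaf_area:.3f} m2")
    print(f"impacted leaf area: {impacted_area:.3f} m2 "
          f"({100 * impacted_area / total_leaf_area:.1f}% of total)")

Rerunning the sketch with a larger step (coarser resolution) reduces the impacted fraction, which is the kind of configuration effect the simulator is meant to expose.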
Abstract:
Recommender systems play an important role in reducing the negative impact of information overload on those websites where users have the possibility of voting for their preferences on items. The most common technique for providing recommendations is collaborative filtering, in which it is essential to find the users most similar to the one for whom recommendations are intended. The hypothesis of this paper is that the results obtained by applying traditional similarity measures can be improved by taking contextual information, drawn from the entire body of users, and using it to calculate the singularity which exists, for each item, in the votes cast by each pair of users being compared. As such, the greater the singularity of the votes cast by two given users, the greater the impact this has on the similarity. The results, tested on the MovieLens, Netflix and FilmAffinity databases, corroborate the excellent behaviour of the proposed singularity measure.
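The abstract does not give the formula for the singularity measure, so the Python sketch below is only a loose illustration of the general idea: votes that are rare for a given item (with respect to the whole body of users) weigh more heavily when comparing two users. The positive/non-positive threshold, the agreement term and the averaging used here are assumptions made for the example, not the measure proposed in the paper.

    import numpy as np

    # Toy ratings matrix (users x items), 0 = not voted, otherwise a 1-5 vote (hypothetical).
    R = np.array([[5, 4, 0, 1, 3],
                  [4, 0, 2, 1, 3],
                  [1, 2, 5, 4, 0],
                  [5, 5, 1, 0, 2]], dtype=float)
    voted = R > 0
    THRESHOLD = 4                                    # votes >= 4 treated as "positive"

    # Singularity of a positive (resp. non-positive) vote on an item: the rarer that
    # kind of vote is among the users who rated the item, the higher its singularity.
    pos = (R >= THRESHOLD) & voted
    p_pos = pos.sum(axis=0) / np.maximum(voted.sum(axis=0), 1)
    sing_pos, sing_neg = 1.0 - p_pos, p_pos

    def singularity_similarity(u, v):
        """Illustrative singularity-weighted similarity between users u and v."""
        common = np.where(voted[u] & voted[v])[0]
        if common.size == 0:
            return 0.0
        terms = []
        for i in common:
            w_u = sing_pos[i] if pos[u, i] else sing_neg[i]
            w_v = sing_pos[i] if pos[v, i] else sing_neg[i]
            agreement = 1.0 - ((R[u, i] - R[v, i]) / 4.0) ** 2   # 4 = max vote difference
            terms.append(w_u * w_v * agreement)
        return float(np.mean(terms))

    for other in (1, 2, 3):
        print(f"sim(user 0, user {other}) = {singularity_similarity(0, other):.3f}")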
Abstract:
Collaborative filtering recommender systems contribute to alleviating the problem of information overload that exists on the Internet as a result of the mass use of Web 2.0 applications. The use of an adequate similarity measure is a determining factor in the quality of the prediction and recommendation results of the recommender system, as well as in its performance. In this paper, we present a memory-based collaborative filtering similarity measure that provides extremely high-quality and balanced results, complemented by a low processing time (high performance) similar to that required to execute traditional similarity metrics. The experiments have been carried out on the MovieLens and Netflix databases, using a representative set of information retrieval quality measures.
Abstract:
Presentation given at the PhD Seminar of ITS 2011 in Budapest. ICTs (Information and Communication Technologies) currently account for 2% of total carbon emissions. However, although modern standards require strict measures to reduce energy consumption across all industrial and services sectors, the ICT sector also faces an increase in demand for services and bandwidth. The deployment of Next Generation Networks (NGN) will be the answer to this new demand; more specifically, Next Generation Access Networks (NGANs) will provide higher-bandwidth access to users. Several policy and cost analyses are being carried out to understand the risks and opportunities of new deployments, but the question of what role energy consumption plays in NGANs seems to be off the table. Thus, this paper proposes a model to analyse the energy consumption of the main fibre-based NGAN architectures: Fibre To The Home (FTTH), in both its Passive Optical Network (PON) and Point-to-Point (PtP) variants, and FTTx/VDSL. The aim of this analysis is to provide deeper insight into the impact of new deployments on the energy consumption of the ICT sector and into the effect of energy consumption on the life-cycle cost of NGANs. The paper also presents an energy consumption comparison of the presented architectures, particularised to the specific geographic and demographic distribution of users in Spain but easily extendable to other countries.
Abstract:
Choosing an appropriate accounting system for manufacturing has always been a challenge for managers. In this article we compare three accounting systems designed since 1980 to address the problems of the traditional accounting system. First, we present a short overview of the background and definition of the three systems: Activity-Based Costing (ABC), Time-Driven Activity-Based Costing (TD-ABC) and Lean Accounting. The comparison is based on the three basic roles of the information generated by accounting systems: financial reporting, decision making, and operational control and improvement. The analysis reveals how decisions are made at the level of the value stream in companies using Lean Accounting, whereas decisions under ABC are taken at the individual product level; we then show how TD-ABC covers both product and process levels for decision making. In addition, the paper shows the importance of nonfinancial measures for operational control and improvement under the Lean Accounting and TD-ABC methods, whereas ABC relies mostly on financial measures in this context.
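Since TD-ABC's core mechanics are compact, a small numerical sketch may help: the method needs only a capacity cost rate and the unit times of each activity, and it reports unused capacity separately instead of spreading it over products. All figures below are hypothetical and are not taken from the article.

    # Minimal TD-ABC sketch with hypothetical figures.
    capacity_cost = 560_000.0        # quarterly cost of the resource group (EUR), hypothetical
    practical_capacity = 700_000.0   # practical capacity in minutes, hypothetical

    rate = capacity_cost / practical_capacity        # EUR per minute of capacity

    unit_times = {                   # minutes consumed per transaction, hypothetical
        "process order": 8,
        "handle inquiry": 44,
        "check credit": 50,
    }
    volumes = {"process order": 49_000, "handle inquiry": 1_400, "check credit": 2_500}

    used_minutes = sum(unit_times[a] * volumes[a] for a in unit_times)
    for a in unit_times:
        print(f"{a:15s}: {unit_times[a] * rate:6.2f} EUR per transaction")
    print(f"capacity used  : {100 * used_minutes / practical_capacity:.1f}% "
          f"(the remainder is reported as unused capacity, not allocated to products)")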
Abstract:
Alzheimer's disease (AD) is the most common cause of dementia. Over the last few years, a considerable effort has been devoted to exploring new biomarkers. Nevertheless, a better understanding of brain dynamics is still required to optimize therapeutic strategies. In this regard, the characterization of mild cognitive impairment (MCI) is crucial, due to the high conversion rate from MCI to AD. However, only a few studies have focused on the analysis of magnetoencephalographic (MEG) rhythms to characterize AD and MCI. In this study, we assess the ability of several parameters derived from information theory to describe spontaneous MEG activity from 36 AD patients, 18 MCI subjects and 26 controls. Three entropies (Shannon, Tsallis and Rényi entropies), one disequilibrium measure (based on the Euclidean distance, ED) and three statistical complexities (based on the López-Ruiz–Mancini–Calbet complexity, LMC) were used to estimate the irregularity and statistical complexity of MEG activity. Statistically significant differences between AD patients and controls were obtained with all parameters (p < 0.01). In addition, statistically significant differences between MCI subjects and controls were achieved by ED and LMC (p < 0.05). In order to assess the diagnostic ability of the parameters, a linear discriminant analysis with a leave-one-out cross-validation procedure was applied. The accuracies reached 83.9% and 65.9% when discriminating AD and MCI subjects from controls, respectively. Our findings suggest that MCI subjects exhibit an intermediate pattern of abnormalities between normal aging and AD. Furthermore, the proposed parameters provide a new description of brain dynamics in AD and MCI.
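For reference, the sketch below collects the textbook definitions of the measures named above (Shannon, Tsallis and Rényi entropies, the Euclidean disequilibrium and the LMC statistical complexity) and applies them to a toy amplitude histogram. The entropic index q, the number of bins and the way the probability distribution is actually estimated from the MEG epochs are not stated in the abstract, so the choices here are illustrative only.

    import numpy as np

    def shannon(p):
        p = p[p > 0]
        return -np.sum(p * np.log(p))

    def tsallis(p, q=2.0):
        return (1.0 - np.sum(p ** q)) / (q - 1.0)

    def renyi(p, q=2.0):
        return np.log(np.sum(p ** q)) / (1.0 - q)

    def disequilibrium(p):
        # Squared Euclidean distance to the uniform (equiprobable) distribution.
        return np.sum((p - 1.0 / p.size) ** 2)

    def lmc_complexity(p):
        # LMC statistical complexity: normalised Shannon entropy times disequilibrium.
        return (shannon(p) / np.log(p.size)) * disequilibrium(p)

    # Toy usage: the amplitude histogram of a signal segment as the distribution.
    rng = np.random.default_rng(1)
    x = rng.standard_normal(10_000)                  # stand-in for one MEG epoch
    counts, _ = np.histogram(x, bins=32)
    p = counts / counts.sum()
    print(shannon(p), tsallis(p), renyi(p), disequilibrium(p), lmc_complexity(p))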
Abstract:
Today, the building sector alone accounts for 40% of the total energy consumption in the European Union (EU). In most EU member states, about 70–90% of the buildings were constructed at least 20 years ago and therefore have worse energy efficiency than new buildings that comply with current regulations. As a consequence, action on the existing building stock is needed, developing specific assessment and advisory methods in order to reduce total energy consumption. This article describes a procedure for the classification and characterization of existing building facades. It can help researchers achieve in-depth knowledge of facade construction and, therefore, of its thermal behavior. With this knowledge, the most appropriate upgrading strategies can be established in order to reduce energy demand. Furthermore, the classified facade typologies have been checked against current and future Spanish regulations and, according to the results obtained, a series of upgrading strategies for both the opaque and the translucent parts of the facade have been proposed. In conclusion, this procedure helps to select the most appropriate improvement measures for each type of facade so as to comply with current and future Spanish regulations. The proposed method has been tested in a specific neighborhood of Madrid, for a selected period of construction (1950–1980), but it could be applied to any other city.
Abstract:
We present an approach to dynamically adapt the language models (LMs) used by a speech recognizer that is part of a spoken dialogue system. We have developed a grammar generation strategy that automatically adapts the LMs using the semantic information that the user provides (represented as dialogue concepts), together with information regarding the intentions of the speaker (inferred by the dialogue manager and represented as dialogue goals). We carry out the adaptation as a linear interpolation between a background LM and one or more of the LMs associated with the dialogue elements (concepts or goals) addressed by the user. The interpolation weights between those models are automatically estimated on each dialogue turn, using measures such as the posterior probabilities of concepts and goals, estimated as part of the inference procedure that determines the actions to be carried out. We propose two approaches to handle the LMs related to concepts and goals: in the first we estimate an LM for each of them, whereas in the second we apply several clustering strategies to group together those elements that share common properties and estimate an LM for each cluster. Our evaluation shows how the system can estimate a dynamic model adapted to each dialogue turn, which helps to improve speech recognition performance (up to 14.82% relative improvement) and, in turn, both the language understanding and the dialogue management tasks.
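As a rough sketch of the interpolation step only (the actual system works with full n-gram LMs and derives the weights from the dialogue manager's inference), the toy Python function below interpolates unigram models, with weights proportional to hypothetical concept/goal posteriors; the background weight of 0.4 and the example vocabulary are arbitrary choices for the illustration.

    def interpolate_lms(background, topic_lms, posteriors, bg_weight=0.4):
        """Linear interpolation of unigram LMs (dicts mapping word -> probability).

        topic_lms maps each concept/goal addressed in the turn to its LM and
        posteriors maps the same names to their estimated posterior; posteriors
        are renormalised so that the topic LMs share the (1 - bg_weight) mass.
        """
        total = sum(posteriors.values()) or 1.0
        weights = {name: (1.0 - bg_weight) * p / total for name, p in posteriors.items()}
        vocab = set(background)
        for lm in topic_lms.values():
            vocab.update(lm)
        return {w: bg_weight * background.get(w, 0.0)
                   + sum(wt * topic_lms[name].get(w, 0.0) for name, wt in weights.items())
                for w in vocab}

    # Hypothetical turn addressing the concepts "date" (high posterior) and "destination".
    background = {"the": 0.4, "to": 0.3, "madrid": 0.1, "monday": 0.1, "please": 0.1}
    topic_lms = {"date": {"monday": 0.6, "the": 0.4},
                 "destination": {"madrid": 0.7, "to": 0.3}}
    posteriors = {"date": 0.8, "destination": 0.2}
    adapted = interpolate_lms(background, topic_lms, posteriors)
    print(sorted(adapted.items(), key=lambda kv: -kv[1]))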
Abstract:
Recommender systems are a type of solution to the information overload problem suffered by users of websites on which they can rate certain items. The collaborative filtering recommender system is considered the most successful approach, as it makes its recommendations based on the votes of users similar to an active user. Nevertheless, the traditional collaborative filtering method selects insufficiently representative users as neighbors of each active user, which means that the recommendations made a posteriori are not precise enough. The method proposed in this thesis performs a pre-filtering of the process, using Pareto dominance, which eliminates the less representative users from the k-neighbor selection process and keeps the most promising ones. The results of the experiments performed on MovieLens and Netflix show a significant improvement in all the quality measures studied when the proposed method is applied.
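As an illustration only, the Python sketch below implements one plausible reading of the Pareto-dominance pre-filter: a candidate neighbour is discarded if some other candidate is at least as close to the active user on every item the three of them rated and strictly closer on at least one. The dominance criterion actually used in the thesis may be defined over different objectives; the ratings are invented.

    import numpy as np

    # Toy ratings matrix (users x items); 0 = unrated (hypothetical data).
    R = np.array([[4, 5, 0, 3, 1],
                  [4, 4, 2, 3, 0],
                  [1, 2, 5, 0, 4],
                  [5, 5, 1, 3, 1],
                  [2, 1, 4, 0, 5]], dtype=float)
    voted = R > 0

    def dominates(a, u, v):
        """True if candidate u Pareto-dominates candidate v w.r.t. active user a."""
        common = voted[a] & voted[u] & voted[v]
        if not common.any():
            return False
        du = np.abs(R[a, common] - R[u, common])   # per-item distances of u to a
        dv = np.abs(R[a, common] - R[v, common])   # per-item distances of v to a
        return bool(np.all(du <= dv) and np.any(du < dv))

    def pareto_prefilter(a):
        """Keep only non-dominated candidates as potential k-neighbours of user a."""
        candidates = [u for u in range(R.shape[0]) if u != a]
        return [v for v in candidates
                if not any(dominates(a, u, v) for u in candidates if u != v)]

    print(pareto_prefilter(0))   # non-dominated neighbour candidates for user 0

The k most similar users would then be selected from this reduced candidate set with an ordinary similarity measure.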
Abstract:
Changing factors affecting road conditions (mainly traffic intensity and weather) mean that a suitable optimal speed varies over time. To address this, variable speed limit (VSL) systems, as opposed to fixed limits, have been developed in recent decades. This term has come to cover a number of speed management systems, most notably dynamic speed limits (DSL). In order to avoid the indiscriminate use of both terms in the literature, this paper proposes a simple classification and offers a review of several experiences, of how their effects are evaluated, and of their results. The study also presents a key indicator that measures speed homogeneity, together with a methodology to obtain the data from floating cars and GPS technology, applying it to a case study on a section of the M30 urban motorway in Madrid (Spain).
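The abstract does not define the homogeneity indicator, so the short sketch below simply uses the coefficient of variation of floating-car speeds on a section as a stand-in: lower values mean more uniform speeds, which is the effect VSL/DSL systems pursue. The GPS records are invented, and the indicator used in the paper may be defined differently (for example over time windows or consecutive sections).

    import statistics

    # Hypothetical floating-car GPS records for one motorway section: (vehicle_id, speed km/h).
    records = [("v1", 78), ("v2", 92), ("v3", 65), ("v4", 88),
               ("v5", 71), ("v6", 95), ("v7", 59), ("v8", 83)]

    speeds = [s for _, s in records]
    mean_speed = statistics.fmean(speeds)
    std_speed = statistics.stdev(speeds)

    # Coefficient of variation as a simple speed-homogeneity indicator.
    cv = std_speed / mean_speed
    print(f"mean speed {mean_speed:.1f} km/h, std {std_speed:.1f} km/h, CV {cv:.2f}")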
Abstract:
This paper presents a CMOS temperature sensor based on the thermal dependencies of leakage currents, targeting the 65 nm node. To compensate for the effect of process fluctuations, the proposed sensor computes the ratio of two measurements of the time it takes a capacitor to discharge through a transistor in the subthreshold regime. Furthermore, a novel charging mechanism for the capacitor is proposed to further increase robustness against fabrication variability. The sensor, including digitization and interfacing, occupies 0.0016 mm² and has an energy consumption of 47.7–633 pJ per sample. The resolution of the sensor is 0.28 °C, and the 3σ inaccuracy over the range 40–110 °C is 1.17 °C.
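The abstract only states that the sensor takes the ratio of two subthreshold discharge-time measurements; the toy model below (the same device at two different gate biases, a textbook exponential subthreshold current and invented device parameters) is meant only to show why such a ratio cancels the process-dependent prefactor and threshold term and is left depending mainly on the thermal voltage kT/q. It is not the circuit described in the paper.

    import numpy as np

    k_B, q_e = 1.380649e-23, 1.602176634e-19

    def discharge_time(T, V_gs, I0=1e-7, V_th=0.35, n=1.4, C=1e-12, dV=0.4):
        """Toy discharge time t = C*dV / I_leak of a capacitor C over a swing dV
        through a subthreshold transistor; all device parameters are hypothetical."""
        V_T = k_B * T / q_e                          # thermal voltage kT/q
        I_leak = I0 * np.exp((V_gs - V_th) / (n * V_T))
        return C * dV / I_leak

    for T in (313.15, 353.15, 383.15):               # 40, 80 and 110 degrees C
        t1 = discharge_time(T, V_gs=0.20)
        t2 = discharge_time(T, V_gs=0.10)
        # The ratio equals exp((0.20 - 0.10) / (n*kT/q)): I0, V_th, C and dV all
        # cancel, leaving a quantity that tracks temperature.
        print(f"T = {T - 273.15:5.1f} C  ->  t2/t1 = {t2 / t1:6.2f}")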
Abstract:
Recent developments to fit the so-called Free Formulation into a variational framework have suggested the possibility of introducing a new category of error estimates for finite element computations. Such error estimates are based on differences between certain multifield functionals, which give the same value for the true solution. In the present paper, the formulation of some estimates of this kind is introduced for elasticity and plate bending problems, and several examples of their performance are discussed. The observed numerical behavior of the new accuracy measures seems acceptable from an engineering point of view. However, further numerical experimentation is still needed to establish practical tolerance levels for real problems.
Abstract:
This paper introduces a regret theory-based scenario building approach, combined with a modified Delphi method, that uses an interactive process to design and assess four different TDM measures (cordon toll, parking charge, increased bus frequency and decreased bus fare). The case study of Madrid is used to present the analysis and provide policy recommendations. The new scenario building approach incorporates expert judgement and transport models in an interactive process. It consists of a two-round modified Delphi survey, answered by a group of Spanish transport experts who participated in the Transport Engineering Congress (CIT 2012), and an integrated land-use and transport (LUTI) model for Madrid called MARS (Metropolitan Activity Relocation Simulator).
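As background for readers unfamiliar with regret-based evaluation, the sketch below shows a generic minimax-regret calculation over the four TDM measures under three invented scenarios. The outcome figures and the scenarios are purely hypothetical, and the paper's actual procedure (which combines the Delphi rounds with the MARS model) is not reproduced here.

    import numpy as np

    # Hypothetical outcomes (e.g. a composite welfare indicator) of the four TDM
    # measures under three invented future scenarios; rows = measures, cols = scenarios.
    measures = ["cordon toll", "parking charge", "increased bus frequency", "decreased bus fare"]
    outcome = np.array([[3.1, 4.0, 4.6],
                        [2.8, 3.5, 3.9],
                        [3.5, 3.8, 4.1],
                        [2.9, 3.3, 3.6]])

    # Regret of a measure in a scenario = best achievable outcome in that scenario
    # minus the measure's outcome; minimax regret picks the measure whose
    # worst-case regret is smallest.
    regret = outcome.max(axis=0) - outcome
    worst_regret = regret.max(axis=1)
    for m, r in zip(measures, worst_regret):
        print(f"{m:25s} worst-case regret {r:.2f}")
    print("minimax-regret choice:", measures[int(np.argmin(worst_regret))])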
Abstract:
In order to minimize car-based trips, transport planners have been particularly interested in understanding the factors that explain modal choices. In the transport modelling literature there has been an increasing awareness that socioeconomic attributes and quantitative variables are not sufficient to characterize travelers and forecast their travel behavior. Recent studies have also recognized that users' social interactions and land use patterns influence travel behavior, especially when changes to transport systems are introduced, but links between international and Spanish perspectives are rarely dealt with. In this paper, factorial and path analyses through a Multiple-Indicator Multiple-Cause (MIMIC) model are used to understand and describe the relationships between different psychological and environmental constructs, social influence, and socioeconomic variables. The MIMIC model generates Latent Variables (LVs) to be incorporated sequentially into Discrete Choice Models (DCM), where the level-of-service and cost attributes of the travel modes are also included directly, in order to measure the effect of the transport policies introduced in Madrid during the last three years in the context of the economic crisis. The data used in this paper were collected from a smartphone-based survey with two panels (n = 255 and 190 respondents, respectively) conducted in Madrid.
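To make the hybrid structure concrete, the sketch below simulates (rather than estimates) a toy version of it: socioeconomic causes feed a latent variable through a MIMIC-style structural equation, attitudinal indicators measure it, and the latent variable then enters the utilities of a binary car versus public transport logit alongside cost and time. All coefficients and data are invented; the actual model in the paper is estimated sequentially from the smartphone panel survey.

    import numpy as np

    rng = np.random.default_rng(2)
    n = 200

    # MIMIC part: socioeconomic causes -> latent variable -> attitudinal indicators.
    income = rng.normal(0, 1, n)
    age = rng.normal(0, 1, n)
    latent = 0.6 * income - 0.3 * age + rng.normal(0, 1, n)     # structural equation
    indicator1 = 0.8 * latent + rng.normal(0, 0.5, n)           # measurement equations
    indicator2 = 0.7 * latent + rng.normal(0, 0.5, n)

    # Choice part: binary logit (car vs public transport) with cost, time and the LV.
    cost_car, time_car = rng.uniform(2, 6, n), rng.uniform(20, 50, n)
    cost_pt, time_pt = rng.uniform(1, 3, n), rng.uniform(25, 60, n)
    beta_cost, beta_time, theta = -0.4, -0.05, -0.8             # hypothetical coefficients
    v_car = beta_cost * cost_car + beta_time * time_car + theta * latent
    v_pt = beta_cost * cost_pt + beta_time * time_pt
    p_car = 1.0 / (1.0 + np.exp(-(v_car - v_pt)))

    print(f"corr(indicator1, latent) = {np.corrcoef(indicator1, latent)[0, 1]:.2f}")
    print(f"corr(indicator2, latent) = {np.corrcoef(indicator2, latent)[0, 1]:.2f}")
    print(f"predicted car share      = {p_car.mean():.2f}")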