898 results for sisäinen benchmarking (internal benchmarking)


Relevance:

10.00%

Publisher:

Abstract:

BACKGROUND: With the implementation of the diagnosis-related groups (DRG) system, competitive pressure on German hospitals has increased. In this context, acute pain management has been shown to offer economic benefits for hospitals. The aim of this study was to analyze the impact of the competitive situation, ownership and the required economic resources on structures and processes for acute pain management. MATERIAL AND METHODS: A standardized questionnaire on structures and processes of acute pain management was mailed to the 885 directors of German departments of anesthesiology listed as members of the German Society of Anesthesiology and Intensive Care Medicine (DGAI, Deutsche Gesellschaft für Anästhesiologie und Intensivmedizin). RESULTS: Most hospitals faced strong regional competition; however, this parameter affected neither the implementation of structures nor the recommended treatment processes for pain therapy. In contrast, privately owned hospitals showed a clear preference for the benchmarking tool QUIPS (quality improvement in postoperative pain therapy). These hospitals also more often presented information on pain management in the corporate clinic mission statement and more frequently published information about the quality of acute pain management in their quality reports. No differences were found between hospitals with different forms of ownership in the implementation of acute pain services, quality circles, the expert standard "pain management" or the recommended treatment processes. Hospitals with a higher case mix index (CMI) more often had certified acute pain management; their corporate mission statements more frequently contained information on coping with pain, and they more frequently presented the quality of pain management in the quality report, implemented quality circles and implemented the expert standard "pain management". The frequency of using the benchmarking tool QUIPS and the implementation of recommended treatment processes did not differ with respect to the CMI. CONCLUSION: In this survey no effect of the competitive situation of hospitals on acute pain management could be demonstrated. Private ownership and a higher CMI were more often associated with structures of acute pain management that are publicly visible in terms of hospital marketing.

Relevance:

10.00%

Publisher:

Abstract:

The evaluation for European Union market approval of coronary stents falls under the Medical Device Directive, adopted in 1993. Specific requirements for the assessment of coronary stents are laid out in supplementary advisory documents. In response to a call by the European Commission to make recommendations for a revision of the advisory document on the evaluation of coronary stents (Appendix 1 of MEDDEV 2.7.1), the European Society of Cardiology (ESC) and the European Association of Percutaneous Cardiovascular Interventions (EAPCI) established a Task Force to develop an expert advisory report. As a basis for its report, the ESC-EAPCI Task Force reviewed existing processes, established a comprehensive list of all coronary drug-eluting stents that have received a CE mark to date, and undertook a systematic review of the literature on all published randomized clinical trials evaluating clinical and angiographic outcomes of coronary artery stents between 2002 and 2013. Based on these data, the Task Force provided recommendations to inform a new regulatory process for coronary stents. Its main recommendations include the implementation of a standardized non-clinical assessment of stents and a novel clinical evaluation pathway for market approval. The two-stage clinical evaluation plan includes a recommendation for an initial pre-market trial with objective performance criteria (OPC) benchmarking using invasive imaging follow-up, leading to conditional CE-mark approval, and a subsequent mandatory, large-scale randomized trial with clinical endpoint evaluation, leading to unconditional CE-mark approval. The data analysis from the Task Force's systematic review may provide a basis for determining OPC for use in future studies. This paper represents an executive summary of the Task Force's report.

Relevance:

10.00%

Publisher:

Abstract:

BACKGROUND: After the introduction of instruments for benchmarking and certification and of a national guideline for acute pain management, the aim of this study was to describe the current structure, processes and quality of German acute pain services (APS). METHODS: All directors of German departments of anaesthesiology were invited to complete a postal questionnaire on structures and processes of acute pain management. The survey asked about staff, techniques and quality criteria, enabling comparison with previous data from 1999 and with surveys from other countries. RESULTS: Four hundred and eight (46%) questionnaires were returned. APS have increased considerably and are now available in 81% of hospitals, mainly anaesthesia-based. However, only 45% fulfilled the minimum quality criteria, such as the assignment of personnel, the organization of patient care during nights and weekends, written protocols for postoperative pain management, and regular assessment and documentation of pain scores. Staff resources varied considerably but had increased compared with 1999. Two daily rounds were performed in 71% of APS, either by physicians and nurses (42%), by physicians only (25%) or by supervised nurses (31%). Most personnel assigned to the APS performed this work alongside other duties. Only 53% of hospitals had an integrated rotation for training their specialty trainees. CONCLUSIONS: The availability of APS in Germany and other countries has increased over the last decade; however, the quality of nearly half of them is questionable. Against the disillusioning background of recently reported unfavourable pain-related patient outcomes, the structures, organization and quality of APS should be revisited.

Relevance:

10.00%

Publisher:

Abstract:

Polymorbid patients, diverse diagnostic and therapeutic options, more complex hospital structures, financial incentives, benchmarking, as well as perceptional and societal changes put pressure on medical doctors, particularly when medical errors surface. This is especially true in the emergency department, where patients face delayed or erroneous initial diagnostic or therapeutic measures and costly hospital stays due to sub-optimal triage. A "biomarker" is any laboratory tool with the potential to better detect and characterise diseases, to simplify complex clinical algorithms and to improve clinical problem solving in routine care. Biomarkers must be embedded in clinical algorithms to complement, not replace, basic medical skills. Unselected ordering of laboratory tests and shortcomings in test performance and interpretation contribute to diagnostic errors. Test results may be ambiguous, with false positive or false negative results generating unnecessary harm and costs. Laboratory tests should only be ordered if the results have clinical consequences. In studies, we must move beyond observational reporting and the meta-analysis of diagnostic accuracies for biomarkers; instead, specific cut-off ranges should be proposed and intervention studies conducted to prove outcome-relevant impacts on patient care. The focus of this review is to exemplify the appropriate use of selected laboratory tests in the emergency setting for which randomised controlled intervention studies have proven clinical benefit. Herein, we focus on initial patient triage and the allocation of treatment opportunities in patients with cardiorespiratory diseases in the emergency department. The following biomarkers will be discussed: proadrenomedullin for prognostic triage assessment and site-of-care decisions, cardiac troponin for acute myocardial infarction, natriuretic peptides for acute heart failure, D-dimers for venous thromboembolism, C-reactive protein as a marker of inflammation, and procalcitonin for antibiotic stewardship in infections of the respiratory tract and sepsis. For these markers we provide an overview of pathophysiology, the historical evolution of the evidence, and strengths and limitations for rational implementation into clinical algorithms. We critically discuss results from the key intervention trials that led to their use in clinical routine, as well as potential future indications. The rationale for the use of all these biomarkers is to tackle, first, diagnostic ambiguity and the defensive medicine that follows from it; second, delayed and sub-optimal therapeutic decisions; and third, prognostic uncertainty with misguided triage and site-of-care decisions, all of which contribute to the waste of our limited health care resources. A multifaceted approach to more targeted management of medical patients from emergency admission to discharge, including biomarkers, will translate into better resource use, shorter length of hospital stay, reduced overall costs, and improved patient satisfaction and outcomes in terms of mortality and re-hospitalisation. Hopefully, the concepts outlined in this review will help readers to improve their diagnostic skills and become more parsimonious requesters of laboratory tests.
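
As an illustration of the cut-off-based clinical algorithms discussed above, the sketch below encodes a simplified procalcitonin rule of the kind evaluated in antibiotic stewardship trials. The thresholds follow commonly published ranges, but the exact bands and the function itself are assumptions for illustration, not clinical guidance and not reproduced from this review.

```python
# Minimal sketch, not clinical guidance: a simplified procalcitonin (PCT)
# cut-off rule of the kind used in antibiotic stewardship algorithms.
# Thresholds (ug/L) follow commonly published ranges; the function name
# and structure are hypothetical.
def pct_antibiotic_guidance(pct_ug_per_l: float) -> str:
    """Map a PCT value to a stewardship recommendation (sketch)."""
    if pct_ug_per_l < 0.1:
        return "antibiotics strongly discouraged"
    elif pct_ug_per_l < 0.25:
        return "antibiotics discouraged"
    elif pct_ug_per_l < 0.5:
        return "antibiotics encouraged"
    return "antibiotics strongly encouraged"

print(pct_antibiotic_guidance(0.3))  # -> "antibiotics encouraged"
```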

Relevance:

10.00%

Publisher:

Abstract:

BACKGROUND: There are no specific Swiss home parenteral nutrition (HPN) data on patient characteristics, quality of life (QoL) and complications. The goal of this study was to collect representative nationwide data on current adult HPN patients in Switzerland for international comparability and benchmarking. METHODS: This was a multicenter, nationwide, observational study. We conducted interviews covering demographics, PN characteristics, QoL and complications. Data were assessed at baseline and after a follow-up of 3 months using a questionnaire. RESULTS: Thirty-three adult patients were included. The most common underlying diseases were cancer, radiation enteritis and status after bariatric surgery, and the most prevalent indication was short bowel syndrome. During the 3-month observation period, body weight increased significantly or stabilized, physical activity scores improved from 34.0 to 39.4 and mental scores improved from 41.9 to 46.4. HPN dependency and travel restrictions were of the greatest concern; diarrhea, xerostomia and/or thirst were frequent complaints. CONCLUSION: Anthropometric parameters and QoL improved during the observation period in this HPN cohort. These Swiss HPN data are a prerequisite for evaluating and comparing HPN recommendations and best clinical practice, the status of professional care instructions, and HPN effectiveness, quality of treatment and patient safety.

Relevance:

10.00%

Publisher:

Abstract:

We used meat-inspection data collected over a period of three years in Switzerland to evaluate slaughterhouse-level, farm-level and animal-level factors that may be associated with whole carcass condemnation (WCC) in cattle after slaughter. The objective of this study was to identify WCC risk factors so that they can be communicated to, and managed by, the slaughter industry and veterinary services. At meat inspection, there were three important predictors of the risk of WCC: the slaughtered animal's sex, its age, and the size of the slaughterhouse in which it was processed. WCC for injuries and significant weight loss (visible welfare indicators) was almost exclusive to smaller slaughterhouses, whereas cattle exhibiting clinical syndromes that were not externally visible (e.g. pneumonia lesions) and that are associated with the fattening of cattle ended up in larger slaughterhouses. For this reason, it is important for animal health surveillance to collect data from both types of slaughterhouse. Other important risk factors for WCC were the on-farm mortality rate and the number of cattle on the farm of origin. This study highlights the fact that the many risk factors for WCC are as complex as the production system itself, with risk factors interacting with one another in ways that are sometimes difficult to interpret biologically. Risk-based surveillance aimed at farms with recurring health problems (e.g. a history of above-average condemnation rates) may be more appropriate than the selection of higher-risk animals arriving at slaughter. In Switzerland, the introduction of a benchmarking system providing feedback to farmers on the reasons for condemnation and on their performance compared with the national or regional average could be a first step towards improving herd management and financial returns for producers.
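
A minimal sketch of the benchmarking feedback proposed in the conclusion: comparing each farm's whole carcass condemnation rate with a national average. All names and figures below are invented for illustration.

```python
# Hypothetical benchmarking feedback: each farm's whole carcass
# condemnation (WCC) rate compared against an assumed national average.
farms = {
    "farm_A": {"slaughtered": 120, "condemned": 3},
    "farm_B": {"slaughtered": 450, "condemned": 4},
}
national_rate = 0.012  # assumed national WCC rate (1.2%)

for name, f in farms.items():
    rate = f["condemned"] / f["slaughtered"]
    flag = "above" if rate > national_rate else "at or below"
    print(f"{name}: {rate:.1%} WCC, {flag} the national average")
```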

Relevance:

10.00%

Publisher:

Abstract:

Next to leisure, sport and household activities, the most common activity resulting in medically consulted injuries and poisonings in the United States is work, with an estimated 4 million workplace-related episodes reported in 2008 (U.S. Department of Health and Human Services, 2009). To address the risks inherent to various occupations, risk management programs are typically put in place that include worker training, engineering controls and personal protective equipment. Recent studies have shown that such interventions alone are insufficient to adequately manage workplace risks, and that the climate in which the workers and the safety program exist (known as the "safety climate") is an equally important consideration. The organizational safety climate is so important that many studies have focused on developing means of measuring it in various work settings. While safety climate studies have been reported for several industrial settings, published studies assessing safety climate in the university work setting are largely absent. Universities are particularly unusual workplaces because of the potential exposure to a diversity of agents representing both acute and chronic risks, and because readily detectable health and safety outcomes are relatively rare. The ability to measure safety climate in a work setting with rarely observed systemic outcome measures could serve as a powerful means of evaluating safety risk management programs. The goal of this research was to develop a survey tool to measure safety climate specifically in the university work setting; the use of a standardized tool also allows comparisons among universities throughout the United States. A first study objective, the quantitative assessment of safety climate, was accomplished at five universities across the United States, where 971 participants completed an online questionnaire. The average safety climate score across the five universities was 3.92 on a scale of 1 to 5, with 5 indicating very high perceptions of safety. The two lowest-scoring dimensions of university safety climate were "acknowledgement of safety performance" and "department and supervisor's safety commitment". The results underscore how the perception of safety climate is significantly influenced at the local level. A second study objective, evaluating the reliability and validity of the safety climate questionnaire, was also accomplished. A third objective fulfilled was to provide executive summaries of the questionnaire results to the participating universities' health and safety professionals and to collect feedback on usefulness, relevance and perceived accuracy; overall, the professionals found the survey and results to be very useful, relevant and accurate. Finally, the safety climate questionnaire will be offered to other universities for benchmarking purposes at the annual meeting of a nationally recognized university health and safety organization. The ultimate goal of the project, the creation of a standardized tool for measuring safety climate in the university work setting that facilitates meaningful comparisons among institutions, was accomplished.
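
A minimal sketch of the score aggregation implied by the study design: averaging 1-to-5 Likert responses into per-dimension and overall safety climate scores. The dimension names echo the abstract; the response values are invented.

```python
# Sketch: aggregating 1-5 Likert survey responses into per-dimension
# and overall safety climate scores. Values are invented.
from statistics import mean

responses = [
    {"acknowledgement_of_safety_performance": 3, "supervisor_safety_commitment": 4},
    {"acknowledgement_of_safety_performance": 2, "supervisor_safety_commitment": 5},
    {"acknowledgement_of_safety_performance": 3, "supervisor_safety_commitment": 4},
]

dimensions = responses[0].keys()
per_dimension = {d: mean(r[d] for r in responses) for d in dimensions}
overall = mean(per_dimension.values())

print(per_dimension)      # mean score per climate dimension
print(round(overall, 2))  # overall score on the 1-5 scale
```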

Relevance:

10.00%

Publisher:

Abstract:

Oncologic specialty societies and multidisciplinary collaborative groups have dedicated considerable effort to developing evidence-based quality indicators (QIs) to facilitate quality improvement, accreditation, benchmarking, reimbursement, maintenance of certification, and regulatory reporting. In particular, radiation oncology as a field has a long history of organized quality assessment efforts, and continues to work toward developing consensus quality standards in the face of continually evolving technologies and standards of care. The present report provides a comprehensive review of the current state of quality assessment in radiation oncology, with an emphasis on recent quality improvement efforts. Specifically, this report aims to highlight implications of the healthcare quality movement for radiation oncology and review existing efforts to define and measure quality in the field, with particular focus on dimensions of quality that are specific to radiation oncology within the "big picture" of oncologic quality improvement efforts.

Relevance:

10.00%

Publisher:

Abstract:

We present and examine a multi-sensor global compilation of mid-Holocene (MH) sea surface temperatures (SST), based on Mg/Ca and alkenone palaeothermometry and reconstructions obtained using planktonic foraminifera and organic-walled dinoflagellate cyst census counts. We assess the uncertainties originating from the use of different methodologies and evaluate the potential of MH SST reconstructions as a benchmark for climate-model simulations. The comparison between different analytical approaches (time frame, baseline climate) shows that the choice of time window for the MH has a negligible effect on the reconstructed SST pattern, whereas the choice of baseline climate affects both the magnitude and the spatial pattern of the reconstructed SSTs. Comparison of the SST reconstructions made using different sensors shows significant discrepancies at a regional scale, with uncertainties often exceeding the reconstructed SST anomaly; apparent patterns in SST may largely reflect the use of different sensors in different regions. Overall, the uncertainties associated with the SST reconstructions are generally larger than the MH anomalies. Thus, the SST data currently available cannot serve as a target for benchmarking model simulations.
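
The core comparison can be reduced to a few lines: the MH anomaly is the reconstructed SST minus the baseline climate, and a reconstruction is only usable as a model benchmark where the anomaly exceeds its uncertainty. A sketch with invented values:

```python
# Sketch of the benchmark test described above; all values are invented.
import numpy as np

mh_sst = np.array([18.2, 21.5, 14.9])    # reconstructed MH SST (degC)
baseline = np.array([17.8, 21.9, 14.6])  # baseline climate SST (degC)
uncertainty = np.array([0.9, 0.3, 1.1])  # reconstruction uncertainty (degC)

anomaly = mh_sst - baseline
usable = np.abs(anomaly) > uncertainty   # anomaly resolvable above noise?

print(anomaly)  # [ 0.4 -0.4  0.3]
print(usable)   # [False  True False] -> most anomalies within uncertainty
```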

Relevance:

10.00%

Publisher:

Abstract:

High-latitude ecosystems play an important role in the global carbon cycle and in regulating the climate system, and are presently undergoing rapid environmental change. Accurate land cover data sets are required both to document these changes and to provide land-surface information for benchmarking and initializing Earth system models. Earth system models also require land cover classification systems based on plant functional types (PFTs), rather than species or ecosystems, so post-processing of existing land cover data is often required. This study compares, over Siberia, multiple land cover data sets against one another and with auxiliary data to identify key uncertainties that contribute to variability in PFT classifications and would introduce errors in Earth system modeling. Land cover classifications from GLC 2000, GlobCover 2005 and 2009, and MODIS collections 5 and 5.1 are first aggregated to a common legend and then compared to high-resolution land cover classifications, vegetation continuous fields (MODIS VCF) and satellite-derived tree heights (to discriminate between sparse, shrub and forest vegetation). The GlobCover data set, with its lower threshold for tree cover, taller tree heights and better spatial resolution, shows tree cover distributions that agree better with the high-resolution data. It has therefore been chosen as the basis for new PFT maps for the ORCHIDEE land surface model at 1 km scale. Compared with the original PFT data set, the new PFT maps based on GlobCover 2005 and an updated cross-walking approach differ mainly in the characterization of forests and the degree of tree cover. The partitioning of grasslands and bare soils now appears more realistic when compared with ground truth data. This new vegetation map provides a framework for the further development of new PFTs in the ORCHIDEE model, such as shrubs, lichens and mosses, to better represent the water and carbon cycles at northern latitudes. Updated land cover data sets are critical for improving and maintaining the relevance of Earth system models for assessing climate and human impacts on biogeochemistry and biophysics.
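
The cross-walking step described above amounts to a lookup from aggregated land cover classes to PFT fractions. A hypothetical sketch follows; the class names and fractions are illustrative and not the paper's actual cross-walk table.

```python
# Hypothetical cross-walk: aggregated land cover classes mapped to
# ORCHIDEE-style PFT fractions (illustrative values only).
crosswalk = {
    "evergreen_needleleaf_forest": {"boreal_needleleaf_evergreen": 0.8,
                                    "bare_soil": 0.2},
    "mixed_forest":                {"boreal_needleleaf_evergreen": 0.4,
                                    "boreal_broadleaf_deciduous": 0.4,
                                    "natural_grass": 0.2},
    "sparse_vegetation":           {"natural_grass": 0.3,
                                    "bare_soil": 0.7},
}

def to_pft_fractions(land_cover_class: str) -> dict:
    """Return the PFT fractions assigned to one land cover class."""
    return crosswalk[land_cover_class]

print(to_pft_fractions("mixed_forest"))
```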

Relevance:

10.00%

Publisher:

Abstract:

This Doctoral Thesis deals with the application of meshless methods to eigenvalue problems, particularly free vibration and buckling. The analysis is focused on aspects such as the numerical solution of the problem, the computational cost and the feasibility of using non-consistent mass or geometric stiffness matrices. Furthermore, the error is analysed in detail, with the aim of identifying its main sources and obtaining the key factors that enable faster convergence for a given problem. Although a wide variety of apparently independent meshless methods can currently be found in the literature, the relationships among them have been analysed. The outcome of this assessment is that all those methods can be grouped into a limited number of categories, and that the Element-Free Galerkin Method (EFGM) is representative of the most important one. The EFGM has therefore been selected as the reference for the numerical analyses. Many of the error sources of a meshless method stem from its interpolation/approximation algorithm. In the EFGM, this algorithm is known as Moving Least Squares (MLS), a particular case of the Generalized Moving Least Squares (GMLS). The accuracy of the MLS depends on the following factors: the order of the polynomial basis p(x), the features of the weight function w(x), and the shape and size of the support domain of this weight function. The individual contribution of each of these factors, along with the interactions among them, has been studied for both regular and irregular arrangements of nodes, by reducing each contribution to a single quantifiable parameter. This assessment is applied to a range of one- and two-dimensional structural benchmark cases, and considers the error not only in terms of eigenvalues (natural frequencies or buckling loads, as appropriate), but also of eigenvectors.
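
For reference, the standard MLS approximation behind the accuracy factors listed above can be stated as follows (a textbook formulation, not reproduced from the thesis). The approximated field is

\[
u^{h}(x) = \sum_{i=1}^{m} p_i(x)\,a_i(x) = \mathbf{p}^{T}(x)\,\mathbf{a}(x),
\]

where the coefficients minimize the weighted least-squares functional

\[
J\big(\mathbf{a}(x)\big) = \sum_{I=1}^{n} w(x - x_I)\,\big[\mathbf{p}^{T}(x_I)\,\mathbf{a}(x) - u_I\big]^{2},
\]

so that the order m of the polynomial basis, the weight function w and the size and shape of its support jointly control the approximation error, exactly the three factors analysed in the thesis.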

Relevance:

10.00%

Publisher:

Abstract:

This thesis contributes to the analysis and design of printed reflectarray antennas. The main part of the work is focused on the analysis of dual offset antennas comprising two reflectarray surfaces, one acting as the sub-reflector and the other as the main reflector. These configurations introduce additional complexity in several respects compared with conventional dual offset reflectors; however, they offer many degrees of freedom that can be used to improve the electrical performance of the antenna. The thesis is organized in four parts: the development of an analysis technique for dual-reflectarray antennas; a preliminary validation of the methodology using equivalent reflector systems as reference antennas; a more rigorous validation of the software tool through the manufacture and testing of a dual-reflectarray antenna demonstrator; and the practical design of dual-reflectarray systems for applications that show the potential of this kind of configuration to scan the beam and to generate contoured beams. In the first part, a general tool has been implemented to analyze high-gain antennas constructed from two flat reflectarray structures. The classic reflectarray analysis based on the Method of Moments (MoM) under the local periodicity assumption is used for both the sub- and main reflectarrays, taking into account the angle of incidence on each reflectarray element. The incident field on the main reflectarray is computed taking into account the field radiated by all the elements of the sub-reflectarray. Two approaches have been developed: one employs a simple approximation to reduce the computation time, while the other does not, but in many cases offers improved accuracy. The approximation consists of computing the field reflected by each element of the main reflectarray only once for all the fields radiated by the sub-reflectarray elements, assuming that the response will be the same because the only difference is a small variation in the angle of incidence. This approximation is very accurate when the elements of the main reflectarray show relatively low sensitivity to the angle of incidence. The analysis technique has been extended to study dual-reflectarray antennas whose main reflectarray is printed on a parabolic, or more generally curved, surface. In many dual-reflectarray configurations the reflectarray elements lie in the near field of the feed horn. To account for the near field radiated by the horn, the incident field on each reflectarray element is computed using a spherical mode expansion. In this region the angles of incidence are moderately wide, and they are considered in the analysis of the reflectarray to better calculate the actual incident field on the sub-reflectarray elements. This technique increases the accuracy of the predicted co- and cross-polar patterns and antenna gain compared with the use of ideal feed models. In the second part, as a preliminary validation, the proposed analysis method has been used to design dual-reflectarray antennas that emulate previous dual-reflector antennas in Ku- and W-bands, including a reflectarray as sub-reflector. The results for the dual-reflectarray antenna compare very well with those of the parabolic reflector and reflectarray sub-reflector: radiation patterns, antenna gain and efficiency are practically the same when the main parabolic reflector is substituted by a flat reflectarray.
The results show that the gain is only reduced by a few tenths of a dB as a result of the ohmic losses in the reflectarray. The phase adjustment on two surfaces provided by the dual-reflectarray configuration can be used to improve the antenna performance in applications requiring multiple beams, beam scanning or shaped beams. Third, a very challenging dual-reflectarray antenna demonstrator has been designed, manufactured and tested for a more rigorous validation of the analysis technique presented. In the proposed antenna configuration, the feed, the sub-reflectarray and the main reflectarray are in the near field of one another, so that the conventional far-field approximations are not suitable for the analysis of the antenna. This geometry is used as a benchmark for the proposed analysis tool under very stringent conditions. Some aspects of the proposed analysis technique that improve the accuracy of the analysis are also discussed. These improvements include a novel method to reduce the inherent cross-polarization introduced mainly by grounded patch arrays. It has been verified that cross-polarization in offset reflectarrays can be significantly reduced by properly adjusting the patch dimensions in the reflectarray so as to produce an overall cancellation of the cross-polarization. The dimensions of the patches are adjusted not only to provide the phase distribution required to shape the beam, but also to exploit the zero crossings of the cross-polarization components. The last part of the thesis deals with direct applications of the technique described. It is directly applicable to the design of contoured-beam antennas for DBS applications, where the cross-polarization requirements are very stringent. The beam shaping is achieved by synthesizing the phase distribution on the main reflectarray while the sub-reflectarray emulates an equivalent hyperbolic sub-reflector. Dual-reflectarray antennas also offer the ability to scan the beam over small angles about boresight. Two possible architectures for a Ku-band antenna are described, based on a dual planar reflectarray configuration that provides electronic beam scanning within a limited angular range. In the first architecture, beam scanning is achieved by introducing phase control in the elements of the sub-reflectarray while the main reflectarray is passive. A second alternative is also studied, in which beam scanning is produced using 1-bit control on the main reflectarray, while a passive sub-reflectarray is designed to provide a large focal distance within a compact configuration. The system aims to provide a solution for bi-directional satellite links for emergency communications. In both proposed architectures, the objective is a compact optics that is simple to fold and deploy.
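
As context for the phase-synthesis steps described above, the standard single-surface reflectarray design condition (a textbook relation, not the thesis' dual-surface synthesis) prescribes, for a beam collimated towards the direction (θ0, φ0),

\[
\phi_R(x_i, y_i) = k_0\big(d_i - (x_i\cos\varphi_0 + y_i\sin\varphi_0)\sin\theta_0\big),
\]

where d_i is the distance from the feed phase centre to element i and k_0 is the free-space wavenumber. In the dual-reflectarray case the required phase adjustment is distributed over the two surfaces, which is precisely the extra degree of freedom exploited here for beam shaping and scanning.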

Relevance:

10.00%

Publisher:

Abstract:

Wind power time series usually show complex dynamics, mainly due to non-linearities related to the wind physics and the power transformation process in wind farms. This article provides an approach to incorporating observed local variables (wind speed and direction) to model some of these effects by means of statistical models. To this end, a benchmark between two different families of varying-coefficient models (regime-switching and conditional parametric models) is carried out. The case of the offshore wind farm of Horns Rev in Denmark has been considered. The analysis is focused on one-step-ahead forecasting at a time series resolution of 10 min. It has been found that the local wind direction contributes to modelling some features of the prevailing winds, such as the impact of wind direction on wind variability, whereas the non-linearities related to the power transformation process can be introduced by considering the local wind speed. In both cases, conditional parametric models outperformed the regime-switching strategy. The results reinforce the idea that each explanatory variable captures a different underlying effect in the dynamics of wind power time series.
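
The conditional parametric family referred to above can be written in its usual varying-coefficient form (a standard formulation; the symbols here are chosen for illustration, not taken from the paper):

\[
p_{t+1} = \theta_0(u_t) + \sum_{j=1}^{m} \theta_j(u_t)\,x_{t,j} + \varepsilon_{t+1},
\]

where x_{t,j} are the lagged power and meteorological inputs, u_t is the conditioning variable (e.g. the local wind direction) and the coefficient functions θ_j(·) are estimated by local weighted regression. A regime-switching model instead replaces the smooth θ_j(·) with a finite set of regime-specific constants, which is one plausible explanation for its weaker performance here.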

Relevance:

10.00%

Publisher:

Abstract:

This paper presents the results of research aimed at formulating a general model to support the implementation and management of an urban road pricing scheme. After preliminary work to define the state of the art in sustainable urban mobility strategies, the problem was framed in terms of transport economics, introducing the concept of external costs, duly translated into the principle of pricing for the use of public infrastructure. The research is based on the definition of a set of direct and indirect indicators that qualify urban areas by land use, mobility, environmental and economic conditions. These indicators were calculated for a selected set of typical European urban areas on the basis of the results of a survey carried out by means of a specific questionnaire. Once the most typical and interesting applications of the road pricing concept had been identified in cities such as London (Congestion Charging), Milan (Ecopass), Stockholm (Congestion Tax) and Rome (ZTL), a large benchmarking exercise and a cross-analysis of the direct and indirect indicators made it possible to define a simple general model, guidelines and key requirements for implementing a pricing-based traffic restriction scheme in a generic urban area. Finally, the model was applied to the design of a road pricing scheme for a particular area in Madrid and to the quantification of the expected results of its implementation from a land use, mobility, environmental and economic perspective.
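
The pricing principle invoked above is the classical Pigouvian one; stated in one line (standard transport economics, not a formula from the paper), the optimal charge at traffic volume q equals the marginal external cost of a trip,

\[
\tau^{*}(q) = \mathrm{MSC}(q) - \mathrm{MPC}(q),
\]

i.e. the gap between the marginal social cost and the marginal private cost that the driver already bears.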

Relevance:

10.00%

Publisher:

Abstract:

In recent years, with the growing popularity of image compression techniques, many architectures have been proposed, generally based on the Forward and Inverse Discrete Cosine Transform (FDCT, IDCT). Alternatively, compression schemes based on the discrete wavelet transform (DWT), used both in the JPEG2000 coding standard and in the H.264-SVC (Scalable Video Coding) standard, do not need to divide the image into non-overlapping blocks or macroblocks. This paper discusses a hardware implementation of the DLMT (Discrete Lopez-Moreno Transform). It proposes a new scheme intermediate between the DCT and the DWT, and compares results against the most relevant proposed architectures for benchmarking. Like the DWT, the DLMT can be applied over a whole image, but without increasing the computational complexity. FPGA implementation results show that the proposed DLMT offers significant performance benefits over the DCT and the DWT, and it is consequently very suitable for implementation in WSN (Wireless Sensor Network) applications.
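
To make the two baselines concrete, the sketch below contrasts a JPEG-style block DCT with a JPEG2000-style whole-image DWT, the two schemes between which the DLMT is positioned. The DLMT itself is not reproduced, since its definition is not given in the abstract. Requires NumPy, SciPy and PyWavelets.

```python
# Sketch of the two baseline schemes only; the DLMT is not reproduced.
import numpy as np
from scipy.fft import dctn
import pywt

image = np.random.rand(64, 64)

# JPEG-style: independent 2-D DCTs over non-overlapping 8x8 blocks.
blocks = image.reshape(8, 8, 8, 8).swapaxes(1, 2)  # 8x8 grid of 8x8 blocks
dct_blocks = dctn(blocks, axes=(2, 3), norm="ortho")

# JPEG2000-style: one wavelet transform over the whole image.
cA, (cH, cV, cD) = pywt.dwt2(image, "haar")

print(dct_blocks.shape)  # (8, 8, 8, 8) block spectra
print(cA.shape)          # (32, 32) approximation subband
```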