936 results for forecasting.
Abstract:
Knowledge of the oceans is limited by the availability of data. This study shows how numerical models can extend that knowledge economically and with little local information. In this work the MOHID numerical model is implemented in two dimensions (2D) to simulate tidal propagation and in three dimensions (3D) to study the baroclinic flow in the Cape Verde Archipelago. The 2D model is based on a regional-scale model with a 6 km grid spacing (level 1), with two nested models of 3 km grid spacing (level 2). It was also used to impose the tide (level 0) on the 3D model (level 1, with 6 km spacing). The three-dimensional model has 50 vertical layers and provides information on the general circulation. The open lateral boundary was forced by the FES 2004 global tide model and by results from the MyOcean project, which supplied temperature, salinity and daily mean level conditions for the 3D model. Meteorological forcing was provided by results from the GFS (Global Forecast System) meteorological model. The model results were compared against the known information on the main currents of the region, sea surface temperature satellite imagery, vertical profiles obtained from Argo floats and results from the MyOcean project. The model reproduces the circulation pattern known for the region and the results show good agreement with the available information. The model shows that the wind influences the surface flow but has little influence on the levels.
Abstract:
In recent years there has been an explosive growth in the development of adaptive and data-driven methods. One of the efficient data-driven approaches is based on statistical learning theory (SLT) (Vapnik 1998). The theory is based on the Structural Risk Minimisation (SRM) principle and has a solid statistical background. When applying SRM we try not only to reduce the training error (to fit the available data with a model) but also to reduce the complexity of the model and the generalisation error. Many nonlinear learning procedures recently developed in neural networks and statistics can be understood and interpreted in terms of the structural risk minimisation inductive principle. A recent methodology based on SRM is called Support Vector Machines (SVM). At present SLT is still under intensive development and SVM are finding new areas of application (www.kernel-machines.org). SVM develop robust and nonlinear data models with excellent generalisation abilities, which is very important both for monitoring and for forecasting. SVM are extremely good when the input space is high dimensional and the training data set is not big enough to develop a corresponding nonlinear model. Moreover, SVM use only support vectors to derive decision boundaries. This opens the way to sampling optimisation, estimation of noise in data, quantification of data redundancy, etc. A presentation of SVM for spatially distributed data is given in (Kanevski and Maignan 2004).
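As a hedged illustration of the support-vector idea described above (not code from the cited work), the following Python sketch fits a nonlinear SVM with scikit-learn and inspects the support vectors that define the decision boundary; the synthetic data and parameter values are assumptions.

```python
# Minimal sketch of a nonlinear SVM classifier; data and hyperparameters are illustrative.
import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(0)
# Synthetic two-class data in a moderately high-dimensional input space (10 features, 80 samples).
X = rng.normal(size=(80, 10))
y = (np.sin(X[:, 0]) + X[:, 1] ** 2 > 1.0).astype(int)

# RBF-kernel SVM: capacity is controlled jointly by C and gamma,
# which is how the fit-versus-complexity trade-off of SRM is tuned in practice.
model = SVC(kernel="rbf", C=1.0, gamma="scale")
model.fit(X, y)

# Only the support vectors enter the decision function, which is what makes
# sampling optimisation and redundancy analysis possible.
print("support vectors:", model.support_vectors_.shape[0], "of", X.shape[0], "samples")
```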
Abstract:
A new parameter is introduced: the lightning potential index (LPI), which is a measure of the potential for charge generation and separation that leads to lightning flashes in convective thunderstorms. The LPI is calculated within the charge separation region of clouds between 0°C and -20°C, where the noninductive mechanism involving collisions of ice and graupel particles in the presence of supercooled water is most effective. As shown in several case studies using the Weather Research and Forecasting (WRF) model with explicit microphysics, the LPI is highly correlated with observed lightning. It is suggested that the LPI may be a useful parameter for predicting lightning as well as a tool for improving weather forecasting of convective storms and heavy rainfall.
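The abstract does not give the LPI formula, so the following Python sketch is purely illustrative and is not the authors' definition: it masks a model column to the 0°C to -20°C layer and combines updraft, graupel and supercooled liquid water there. All field names and the functional form are assumptions.

```python
import numpy as np

def lightning_potential_sketch(t_celsius, w, q_graupel, q_liquid, dz):
    """Illustrative column index, NOT the published LPI formula.

    t_celsius : temperature per level (degC)
    w         : vertical velocity per level (m/s)
    q_graupel : graupel mixing ratio per level (kg/kg)
    q_liquid  : supercooled liquid water mixing ratio per level (kg/kg)
    dz        : layer thickness per level (m)
    """
    # Restrict to the charge-separation layer between 0 and -20 degC.
    layer = (t_celsius <= 0.0) & (t_celsius >= -20.0)
    # Assume charging needs updraft, graupel and supercooled liquid together.
    activity = w ** 2 * np.sqrt(q_graupel * q_liquid)
    return np.sum(activity[layer] * dz[layer]) / max(np.sum(dz[layer]), 1.0)
```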
Abstract:
This paper analyses the predictive ability of quantitative precipitation forecasts (QPF) and the so-called "poor-man" rainfall probabilistic forecasts (RPF). With this aim, the full set of warnings issued by the Meteorological Service of Catalonia (SMC) for potentially dangerous events due to severe precipitation has been analysed for the year 2008. For each of the 37 warnings, the QPFs obtained from the limited-area model MM5 have been verified against hourly precipitation data provided by the rain gauge network covering Catalonia (NE Spain), managed by the SMC. For a group of five selected case studies, a QPF comparison has been undertaken between the MM5 and COSMO-I7 limited-area models. Although MM5's predictive ability has been examined for these five cases by making use of satellite data, this paper only shows in detail the heavy precipitation event of 9-10 May 2008. Finally, the "poor-man" rainfall probabilistic forecasts (RPF) issued by the SMC at regional scale have also been tested against hourly precipitation observations. Verification results show that for long events (>24 h) MM5 tends to overestimate total precipitation, whereas for short events (≤24 h) the model tends instead to underestimate precipitation. The analysis of the five case studies concludes that most of MM5's QPF errors are mainly triggered by a very poor representation of some of its cloud microphysical species, particularly the cloud liquid water and, to a lesser degree, the water vapor. The comparison of the models' performance shows that MM5 and COSMO-I7 are at the same level of QPF skill, at least for the intense-rainfall events dealt with in the five case studies, whilst the warnings based on RPF issued by the SMC have proven fairly correct when tested against hourly observed precipitation over 6-h intervals and at a small regional scale. Throughout this study we have only dealt with (SMC-issued) warning episodes in order to analyse deterministic (MM5 and COSMO-I7) and probabilistic (SMC) rainfall forecasts, and have therefore not taken into account episodes that might (or might not) have been missed by the official SMC warnings. Consequently, whenever we talk about "misses", it is always in relation to the deterministic LAMs' QPFs.
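A minimal sketch of the kind of over/underestimation check described above (model totals against gauge totals per warning event). The 24 h split and the input names are assumptions for illustration, not the verification scores used in the paper.

```python
import numpy as np

def qpf_bias(forecast_mm, observed_mm, duration_h, threshold_h=24):
    """Split events by duration and report the mean forecast/observed ratio per group.

    A ratio above 1 indicates overestimation, below 1 underestimation.
    """
    forecast_mm = np.asarray(forecast_mm, float)
    observed_mm = np.asarray(observed_mm, float)
    duration_h = np.asarray(duration_h, float)
    long_events = duration_h > threshold_h
    ratio = forecast_mm / np.maximum(observed_mm, 0.1)  # guard against division by zero
    return {
        "long (> %d h)" % threshold_h: float(ratio[long_events].mean()),
        "short (<= %d h)" % threshold_h: float(ratio[~long_events].mean()),
    }

# Illustrative call with made-up event totals (mm) and durations (h).
print(qpf_bias([60, 25, 40], [45, 35, 30], [36, 12, 30]))
```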
Abstract:
The current operational very short-term and short-term quantitative precipitation forecast (QPF) at the Meteorological Service of Catalonia (SMC) is produced by three different methodologies: advection of the radar reflectivity field (ADV); identification, tracking and forecasting of convective structures (CST); and numerical weather prediction (NWP) models using observational data assimilation (radar, satellite, etc.). These precipitation forecasts have different characteristics, lead times and spatial resolutions. The objective of this study is to combine these methods in order to obtain a single, optimised QPF at each lead time. This combination (blending) of the radar forecasts (ADV and CST) and the precipitation forecast from the NWP model is carried out by means of different methodologies according to the prediction horizon. Firstly, in order to take advantage of the rainfall location and intensity from radar observations, a phase-correction technique is applied to the NWP output to derive an additional corrected forecast (MCO). To select the best precipitation estimate in the first and second hour (t+1 h and t+2 h), the information from radar advection (ADV) and the corrected model output (MCO) are mixed using different weights, which vary dynamically according to indexes that quantify the quality of these predictions. This procedure integrates the skill in rainfall location and patterns given by the advection of the radar reflectivity field with the capacity of the NWP models to generate new precipitation areas. From the third hour (t+3 h) onwards, as radar-based forecasting generally has low skill, only the quantitative precipitation forecast from the model is used. This blending of different sources of prediction is verified for different types of episodes (convective, moderately convective and stratiform) in order to obtain a robust methodology for implementing it operationally and dynamically.
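The following Python sketch illustrates the lead-time-dependent blending idea described above. The weight formula, the quality indexes and the t+3 h cut-off handling are illustrative assumptions, not the operational SMC scheme.

```python
import numpy as np

def blend_qpf(adv_field, mco_field, lead_time_h, skill_adv, skill_mco):
    """Blend a radar-advection (ADV) field with a phase-corrected NWP (MCO) field.

    skill_adv, skill_mco: quality indexes in [0, 1] quantifying each forecast's
    recent performance; how they are computed is an assumption here.
    """
    adv_field = np.asarray(adv_field, float)
    mco_field = np.asarray(mco_field, float)
    if lead_time_h >= 3:
        # From t+3 h onwards radar-based extrapolation has low skill: use the NWP forecast only.
        return mco_field
    # Dynamic weight proportional to the relative skill of the radar extrapolation.
    w_adv = skill_adv / (skill_adv + skill_mco + 1e-9)
    return w_adv * adv_field + (1.0 - w_adv) * mco_field

# Illustrative call: 2 x 2 rainfall fields (mm/h) at lead time t+1 h.
print(blend_qpf([[2.0, 0.0], [1.0, 3.0]], [[1.0, 0.5], [0.0, 4.0]], 1, 0.7, 0.5))
```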
Abstract:
While the treatment of refractory status epilepticus (SE) relies on the use of anesthetic agents, mostly barbiturates, propofol, or midazolam, a review of the available literature shows that the level of evidence is low. Induction of therapeutic coma appears straightforward for generalized convulsive or subtle SE, but this approach is debated for complex partial SE. Each anesthetic has its own advocates, and specific advantages and risks; furthermore, several different protocols have been reported regarding the duration and depth of sedation. However, it seems that the biological background of the patient (especially the etiology) remains the main prognostic determinant in SE. There is a clear need for controlled trials on this topic.
Abstract:
The new Swiss federal law on organ donation and transplantation strengthens the responsibilities of intensive care units. In the Italian- and French-speaking parts of Switzerland, the Programme Latin pour le Don d'Organe (PLDO) has been launched to foster a wider collaboration between intensivists and donation coordinators. The PLDO aims at optimising knowledge and expertise in organ donation through improvements in the identification, notification and management of organ donors and their next of kin. The PLDO provides education to all professionals involved. Such an organisation should make it possible to increase the number of organs available, while improving the experience of healthcare professionals and the emotional experience of the next of kin throughout the donation process.
Abstract:
Over the last 30 years, the proliferation of quantitative models for predicting corporate insolvency in the accounting and finance literature has attracted great interest among specialists and researchers in the field. What began as models built with a single objective has turned into a constant source of research. In this paper, an insolvency prediction model is formulated by combining different quantitative variables extracted from the financial statements of a sample of companies for the years 1994-1997. Through a stepwise procedure, the most relevant variables in terms of informational contribution are selected and interpreted. Once this first type of model has been formulated, an alternative to the previous variables is sought through the factor technique of principal component analysis. With it, a selection of variables is made and, together with the previous ratios, univariate analysis is applied. Finally, the models obtained are compared and it is concluded that, although the prior literature offers better classification rates, the models obtained through principal component analysis should not be rejected, given the clarity with which they explain the causes that lead a company to insolvency.
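A minimal Python sketch of the two modelling routes described above: a classifier on the raw accounting ratios versus the same classifier on a few principal components of those ratios. The synthetic data, the ratio count and the choice of logistic regression are assumptions, not the study's actual variables or estimator.

```python
# Illustrative comparison of raw-ratio vs. principal-component inputs; data are synthetic.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(1)
ratios = rng.normal(size=(200, 8))          # e.g. liquidity, leverage, profitability ratios
insolvent = (ratios[:, 0] - ratios[:, 3] + rng.normal(scale=0.5, size=200) > 0).astype(int)

# Route 1: classifier on the original ratios.
acc_ratios = cross_val_score(LogisticRegression(max_iter=1000), ratios, insolvent, cv=5).mean()

# Route 2: classifier on a few principal components of the same ratios.
components = PCA(n_components=3).fit_transform(ratios)
acc_pca = cross_val_score(LogisticRegression(max_iter=1000), components, insolvent, cv=5).mean()

print(f"accuracy with raw ratios: {acc_ratios:.2f}, with principal components: {acc_pca:.2f}")
```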
Abstract:
Distance-based regression is a prediction method consisting of two steps: from the distances between observations we obtain latent variables, which become the regressors in an ordinary least squares linear model. The distances are computed from the original predictors using a suitable dissimilarity function. Since, in general, the regressors are nonlinearly related to the response, their selection with the usual F test is not possible. In this work we propose a solution to this predictor selection problem by defining generalized test statistics and adapting a nonparametric bootstrap method to estimate their p-values. We include a numerical example with automobile insurance data.
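A hedged Python sketch of the two-step procedure described above: latent variables extracted from a distance matrix via classical multidimensional scaling, then used as regressors in an ordinary least squares fit. The Euclidean distance, the synthetic data and the number of retained coordinates are illustrative choices; the generalized tests and bootstrap are not shown.

```python
# Sketch of distance-based regression: latent variables from a distance matrix, then OLS.
import numpy as np

def principal_coordinates(D, k):
    """Classical multidimensional scaling: k latent coordinates from a distance matrix D."""
    n = D.shape[0]
    J = np.eye(n) - np.ones((n, n)) / n          # centring matrix
    B = -0.5 * J @ (D ** 2) @ J                  # doubly centred inner-product matrix
    eigval, eigvec = np.linalg.eigh(B)
    order = np.argsort(eigval)[::-1][:k]         # largest eigenvalues first
    return eigvec[:, order] * np.sqrt(np.maximum(eigval[order], 0.0))

rng = np.random.default_rng(2)
X = rng.normal(size=(100, 5))                                  # original predictors
y = np.sin(X[:, 0]) + rng.normal(scale=0.1, size=100)          # nonlinear response

D = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=-1)     # pairwise Euclidean distances
Z = principal_coordinates(D, k=4)                              # latent regressors
Z1 = np.column_stack([np.ones(len(Z)), Z])                     # add intercept
beta, *_ = np.linalg.lstsq(Z1, y, rcond=None)                  # ordinary least squares fit
print("fitted coefficients:", np.round(beta, 3))
```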
Abstract:
Progress in genomics, in particular high-throughput next-generation sequencing, is revolutionizing oncology. The impact of these techniques is seen, on the one hand, in the identification of germline mutations that predispose to a given type of cancer, allowing personalized care of patients or healthy carriers and, on the other hand, in the characterization of all the acquired somatic mutations of the tumor cell, opening the door to personalized treatment targeting the driver oncogenes. In both cases, next-generation sequencing techniques allow a global approach whereby the full set of genomic mutations is analyzed and correlated with the clinical data. The benefits for the quality of care delivered to our patients are extremely impressive.
Abstract:
The historically reactive approach to identifying safety problems and mitigating them involves selecting black spots or hot spots by ranking locations based on crash frequency and severity. This approach focuses mainly on the corridor level without taking into consideration the exposure rate (vehicle miles traveled) and the socio-demographic information of the study area, which are very important in the transportation planning process. A larger analysis unit at the Transportation Analysis Zone (TAZ) level, or at the network planning level, should be used to address the future development needs of the community and to incorporate safety into the long-range transportation planning process. In this study, existing planning tools (such as the PLANSAFE models presented in NCHRP Report 546) were evaluated for forecasting safety in small and medium-sized communities, particularly as related to changes in socio-demographic characteristics, traffic demand, road network, and countermeasures. The research also evaluated the applicability of the Empirical Bayes (EB) method to network-level analysis. In addition, the application of the United States Road Assessment Program (usRAP) protocols at the local urban road network level was investigated. This research evaluated the applicability of these three methods for the City of Ames, Iowa. The outcome of this research is a systematic process and framework for considering road safety issues explicitly in the small and medium-sized community transportation planning process and for quantifying the safety impacts of new developments and policy programs. More specifically, quantitative safety may be incorporated into the planning process through effective visualization and increased awareness of safety issues (usRAP), the identification of high-risk locations with potential for improvement (usRAP maps and EB), countermeasures for high-risk locations (EB before-and-after studies and PLANSAFE), and socio-economic and demographic induced changes at the planning level (PLANSAFE).
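As a hedged sketch of the Empirical Bayes idea mentioned above, in its standard road-safety form (weighting a safety performance function prediction against the observed crash count); the overdispersion value and crash numbers below are illustrative assumptions, not results from this study.

```python
def empirical_bayes_crashes(predicted, observed, overdispersion_k):
    """Empirical Bayes expected crash frequency for a site.

    predicted        : crashes predicted by a safety performance function over the study period
    observed         : crashes actually recorded over the same period
    overdispersion_k : overdispersion parameter of the SPF's negative binomial model
    """
    # More weight goes to the SPF prediction when k * predicted is small.
    weight = 1.0 / (1.0 + overdispersion_k * predicted)
    return weight * predicted + (1.0 - weight) * observed

# Illustrative numbers only: an SPF predicting 4 crashes where 9 were observed.
print(round(empirical_bayes_crashes(predicted=4.0, observed=9.0, overdispersion_k=0.3), 2))
```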
Abstract:
In this thesis, we study the use of prediction markets for technology assessment. We particularly focus on their ability to assess complex issues, the design constraints required for such applications and their efficacy compared to traditional techniques. To achieve this, we followed a design science research paradigm, iteratively developing, instantiating, evaluating and refining the design of our artifacts. This allowed us to make multiple contributions, both practical and theoretical. We first showed that prediction markets are adequate for properly assessing complex issues. We also developed a typology of design factors and design propositions for using these markets in a technology assessment context. Then, we showed that they are able to solve some of the issues related to the R&D portfolio management process, and we proposed a roadmap for their implementation. Finally, by comparing the instantiation and the results of a multi-criteria decision method and of a prediction market, we showed that the latter is more efficient while offering similar results. We also proposed a framework for comparing forecasting methods, in order to identify the relevant constraints based on contingency factors. In conclusion, our research opens a new field of application for prediction markets and should help hasten their adoption by enterprises.
Abstract:
We present a simple model of communication in networks with hierarchical branching. We analyze the behavior of the model from the viewpoint of critical systems under different situations. For certain values of the parameters, a continuous phase transition between a sparse and a congested regime is observed and accurately described by an order parameter and the power spectra. At the critical point the behavior of the model is totally independent of the number of hierarchical levels. Scaling properties are also observed when the size of the system varies. The presence of noise in the communication is shown to break the transition. The analytical results are a useful guide to forecasting the main features of real networks.