831 results for Estimations


Relevance:

10.00%

Publisher:

Abstract:

This paper presents the results of a study of the Social Sciences in Mexico over the period 1997-2006, carried out to characterize features such as the field's historical evolution and scientific productivity (R&D) through the volume of documents generated, the language of publication, productivity indices by year, subject and federal state, patterns of national and international authorship and co-authorship, and citation and co-citation among publications, institutions and scientific sub-disciplines (research fronts), among other indicators, using the documentary research technique of bibliometric analysis. According to estimates derived from the ISI Citation Index databases, scientific production in the Social Sciences represented 8% of total Mexican output over the period studied, while the Humanities accounted for 1.5% and the Applied Sciences for 90.5%.

Relevance:

10.00%

Publisher:

Abstract:

An important assumption in statistical analyses of the financial-market effects of a central bank's large-scale asset purchase program is that the "long-term debt stock variables were exogenous to term premia". We test this assumption for a small open economy in a currency union over the period 2000M3 to 2015M10, via the determinants of short-term relative to long-term financing. Empirical estimations indicate that the maturity composition of debt responds neither to the level of interest rates nor to the term structure. These findings suggest weak adherence to the cost-minimization mandate of debt management. We do find, however, that volatility decreases and relative market size increases short-term relative to long-term financing, and that short-term financing declines as government indebtedness rises.
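
A specification of the kind tested might be written as follows; the notation is illustrative, since the abstract does not give the exact regression:

    \frac{ST_t}{LT_t} = \beta_0 + \beta_1 i_t + \beta_2 (i_t^{long} - i_t^{short}) + \beta_3 \sigma_t + \beta_4 Size_t + \beta_5 Debt_t + \varepsilon_t

where ST/LT is short-term relative to long-term financing, i_t the interest-rate level, the bracketed spread the term structure, \sigma_t volatility, Size_t relative market size and Debt_t government indebtedness. The reported findings then correspond to \beta_1 \approx \beta_2 \approx 0, \beta_3 < 0, \beta_4 > 0 and \beta_5 < 0.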

Relevance:

10.00%

Publisher:

Abstract:

This paper confirms the importance of financial-system behaviour to the credit channel of monetary policy across the entire European Union (EU). It uses panel fixed-effects estimations and quarterly data for 26 EU countries from Q1 1999 to Q3 2006 in an adaptation of the Bernanke and Blinder (1988) model. The findings also reveal the high degree of foreign dependence and indebtedness of EU banking institutions and their similar reactions to the macroeconomic and monetary-policy environments.
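
In the spirit of that adaptation, a generic panel fixed-effects lending equation could take the form below; the regressors are placeholders, as the abstract does not list them:

    \Delta L_{it} = \alpha_i + \beta_1 \Delta r_t + \beta_2' X_{it} + \varepsilon_{it}

where L_{it} is bank lending in country i, \alpha_i a country fixed effect, r_t the monetary-policy rate and X_{it} macroeconomic and banking-sector controls.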

Relevance:

10.00%

Publisher:

Abstract:

Posidonia oceanica is a Mediterranean endemic seagrass that forms meadows covering ca. 2.5-4.5 million hectares, representing ca. 25% of the infralittoral and shallow circalittoral (down to 50 m) bottoms of the Mediterranean. This seagrass is considered a habitat-engineering species and provides a large number of ecosystem services. In addition, the Marine Strategy Framework Directive (MSFD, 2008/56/EC) includes seagrasses among the elements used to evaluate the "Good Environmental Status" of the European coasts. Information on the phenological characteristics and structure of the meadows is needed for indicator estimations in order to establish their conservation status. The studied meadows are located at the westernmost limit of the P. oceanica distribution (north-western Alboran Sea), in the vicinity of the Strait of Gibraltar, an Atlantic-Mediterranean water transition area. Four sites were selected from east to west: Paraje Natural de Acantilados de Maro-Cerro Gordo (hereafter Maro), the Special Area of Conservation "Calahonda" (hereafter Calahonda), the Site of Community Importance Estepona (hereafter Estepona) and Punta Chullera (hereafter Chullera), where P. oceanica forms its westernmost meadows. Phenological data were recorded from mid-November to mid-December in P. oceanica patches located at 2-3 m depth. At each site three types of patches were sampled: small (patch area <1 m2), medium (1-2 m2) and large (>2 m2). In each patch at each site, three 45 x 45 cm quadrats were sampled for shoot and inflorescence density measurements. In each quadrat, 10 random shoots were sampled for shoot morphology (shoot height and number of leaves). Shoot and inflorescence densities were standardized to square meters. All the studied P. oceanica meadows develop on rock and present a fragmented structure, with coverage ranging between ca. 45% in Calahonda and Estepona and ca. 31% in Maro. The meadows of Chullera are reduced to a few small to medium patches with areas of 0.5-1.5 m2 (Fig. 1). The meadows of Chullera and Estepona presented similar values of shoot density (ca. 752 and 662 shoots m-2, respectively) and leaf height (ca. 25 cm). The Calahonda and Maro meadows likewise showed similar shoot densities (ca. 510 and 550 shoots m-2, respectively), but lower than those of the sites closer to the Strait of Gibraltar. Regarding patch size and leaf height, the longest leaves (ca. 25 cm) were found in medium and large patches, whereas the number of leaves per shoot was higher in small and medium patches (ca. 6.3 leaves per shoot). Flowering was detected only in the Calahonda meadows, with maximum values of ca. 330 inflorescences m-2 (115.2 ± 98.2 inflorescences m-2, n = 9; mean ± SD) (Fig. 1). Inflorescence density was not significantly different among patches of different sizes. Unlike the studied meadows, extensive beds of P. oceanica occur in the Alboran Sea at the National Park of Cabo de Gata (north-eastern Alboran Sea); from east to west (towards the Strait of Gibraltar), however, the meadows become progressively more fragmented and their depth range decreases from 30 m at Cabo de Gata to 2 m at Chullera. The Atlantic influence and the characteristic oceanographic conditions of the Alboran Sea (i.e., higher turbidity and water turbulence) probably limit the development of P. oceanica at greater depths.
Similarities were detected between the meadows located closer to the Strait of Gibraltar (Chullera and Estepona), as well as between the more distant ones (Calahonda and Maro). The former showed higher shoot densities and leaf heights than the latter, which could be related to the higher hydrodynamic exposure of the Chullera and Estepona meadows. Regarding flowering, sexual reproduction in P. oceanica is uncommon across the Mediterranean Sea. The available information suggests that flowering is an irregular event related to high seawater temperature; indeed, the flowering episode recorded in Calahonda in November 2015 coincided with the warmest year on record. This is the third flowering event registered in these meadows near the westernmost distributional limit of P. oceanica (Málaga, Alboran Sea), which could indicate that these meadows are in a healthy state. Furthermore, the absence of significant differences in inflorescence density between patches of different sizes may indicate that fragmentation does not necessarily influence flowering in this seagrass species.
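
For reference, the standardization from the 45 x 45 cm quadrats to square meters works out as

    \rho = N / 0.45^2 = N / 0.2025 \approx 4.94 N shoots m-2,

so a count of about 150 shoots per quadrat corresponds to roughly 740 shoots m-2, the order of magnitude reported for Chullera.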

Relevance:

10.00%

Publisher:

Abstract:

Smart grid constrained optimal control is a complex issue due to the constant growth of grid complexity and the large volume of data available as input to smart device control. In this context, traditional centralized control paradigms may suffer in terms of the timeliness of optimization results due to the volume of data to be processed and the delayed, asynchronous nature of data transmission. To address these limits of centralized control, this paper presents a coordinated, distributed algorithm based on local controllers and a central coordinator that exchanges summarized global state information. The proposed model for exchanging global state information is resistant to fluctuations caused by the inherent interdependence between local controllers and is robust to delays in information exchange. In addition, the algorithm features iterative refinement of local state estimations that improves the local controllers' ability to operate within network constraints. Application of the proposed algorithm in simulation shows its effectiveness in optimizing a global goal within a complex distribution system operating under constraints, while ensuring stable network operation under varying levels of information-exchange delay and across a range of network sizes.
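
A minimal sketch of the coordination pattern described above, with illustrative names and update rules (this is not the paper's algorithm): local controllers refine their setpoints against a local constraint using only an aggregate summary received from the coordinator, and tolerate the summary being stale.

import random

class LocalController:
    def __init__(self, demand, limit):
        self.demand = demand          # local load to satisfy
        self.limit = limit            # local network constraint
        self.setpoint = 0.0

    def update(self, global_summary):
        # Iteratively refine the local setpoint toward a blend of local
        # demand and the coordinator's summary, never exceeding the limit.
        target = self.demand + 0.5 * (global_summary - self.demand)
        self.setpoint = min(target, self.limit)
        return self.setpoint

class Coordinator:
    def summarize(self, setpoints):
        # Exchange only a summary (here: the mean), not raw local data.
        return sum(setpoints) / len(setpoints)

controllers = [LocalController(random.uniform(1.0, 5.0), limit=4.0) for _ in range(8)]
coordinator = Coordinator()
summary = 0.0                         # stale/empty summary at start-up
for _ in range(20):
    setpoints = [c.update(summary) for c in controllers]
    summary = coordinator.summarize(setpoints)
print(round(summary, 3))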

Relevance:

10.00%

Publisher:

Abstract:

Wireless sensor networks are often deployed in large numbers over a large geographical region in order to monitor phenomena of interest. The sensors used in such networks often suffer from random or systematic errors such as drift and bias. Even if they are calibrated at the time of deployment, they tend to drift as time progresses. Consequently, progressive manual calibration of such a large-scale sensor network becomes impossible in practice. In this article, we address this challenge by proposing a collaborative framework to automatically detect and correct the drift in order to keep the data collected from these networks reliable. We propose a novel scheme that uses geospatial estimation-based interpolation techniques on measurements from neighboring sensors to collaboratively predict the value of the phenomenon being observed. The predicted values are then used iteratively to correct the sensor drift by means of a Kalman filter. Our scheme can be implemented in a centralized as well as a distributed manner to detect and correct the drift generated in the sensors. For the centralized implementation, we compare several kriging- and non-kriging-based geospatial estimation techniques in combination with the Kalman filter, and show the superiority of the kriging-based methods in detecting and correcting the drift. To demonstrate the applicability of our distributed approach in a real-world scenario, we implement our algorithm on a network consisting of Wireless Sensor Network (WSN) hardware. We further evaluate single- as well as multiple-drifting-sensor scenarios to show the effectiveness of our algorithm. We also address the high power usage of data transmission among neighboring nodes, which shortens network lifetime in the distributed approach, by proposing two power-saving schemes. Moreover, we compare our algorithm with a blind calibration scheme from the literature and demonstrate its superiority in detecting both linear and nonlinear drifts.
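
A sketch of the core idea, with inverse-distance weighting standing in for the kriging interpolator and a scalar Kalman filter tracking the drift state; all names and noise parameters are illustrative assumptions, not the paper's implementation.

import numpy as np

def idw_predict(positions, readings, target, power=2.0):
    # Predict the phenomenon at 'target' from neighbors' readings
    # (a simple stand-in for the kriging-based interpolation).
    d = np.linalg.norm(positions - target, axis=1)
    w = 1.0 / np.maximum(d, 1e-9) ** power
    return float(np.sum(w * readings) / np.sum(w))

def track_drift(raw, neighbor_preds, q=1e-4, r=0.05):
    # Scalar Kalman filter: drift follows a random walk (variance q);
    # the reading-minus-prediction gap is a noisy drift measurement (r).
    drift, p = 0.0, 1.0
    corrected = []
    for y, pred in zip(raw, neighbor_preds):
        p += q                        # predict step
        z = y - pred                  # measured drift proxy
        k = p / (p + r)               # Kalman gain
        drift += k * (z - drift)      # update drift estimate
        p *= (1.0 - k)
        corrected.append(y - drift)   # drift-corrected reading
    return np.array(corrected)

# A sensor drifting upward by 0.01 per step around a constant truth of 20:
true, steps = 20.0, 200
raw = true + 0.01 * np.arange(steps) + np.random.default_rng(1).normal(0, 0.1, steps)
preds = np.full(steps, true)          # neighbors' interpolated estimate
print(track_drift(raw, preds)[-5:])   # corrected values stay near 20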

Relevance:

10.00%

Publisher:

Abstract:

OBJECTIVES: To systematically review cost-of-illness studies for schizophrenia (SC), epilepsy (EP) and type 2 diabetes mellitus (T2DM) and to explore the transferability of direct medical costs across countries.

METHODS: A comprehensive literature search was performed to identify studies that estimated direct medical costs. A generalized linear model (GLM) with gamma distribution and log link was used to explore the variation in costs accounted for by the included factors. Both parametric (random-effects model) and non-parametric (bootstrapping) meta-analyses were performed to pool the converted raw cost data (expressed as a percentage of GDP/capita of the country where each study was conducted).
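
As a sketch of the non-parametric pooling step, bootstrapping the mean of per-study costs expressed as a percentage of GDP/capita might look like this; the input values are placeholders, not the study's data.

import numpy as np

rng = np.random.default_rng(0)
costs = np.array([12.1, 18.4, 15.0, 22.3, 9.8])   # % of GDP/capita (illustrative)

# Resample studies with replacement and collect the mean of each resample.
boot_means = [rng.choice(costs, size=costs.size, replace=True).mean()
              for _ in range(10_000)]
lo, hi = np.percentile(boot_means, [2.5, 97.5])
print(f"pooled mean {np.mean(boot_means):.2f}%, 95% CI ({lo:.2f}, {hi:.2f})")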

RESULTS: In total, 93 articles were included (40 studies for T2DM, 34 for EP and 19 for SC). Significant variance in direct medical costs was detected both within and between disease classes. Multivariate analysis identified GDP/capita (p<0.05) as a significant factor contributing to the large variance in cost results. The bootstrapping meta-analysis generated more conservative estimations, with slightly wider 95% confidence intervals (CI) than the parametric meta-analysis, yielding means (95% CI) of 16.43% (11.32, 21.54) for T2DM, 36.17% (22.34, 50.00) for SC and 10.49% (7.86, 13.41) for EP.

CONCLUSIONS: Converting raw cost data into a percentage of the GDP/capita of each country was demonstrated to be a feasible approach for transferring direct medical costs across countries. Our approach of combining an estimated direct cost value with the size of the specific disease population in each jurisdiction could be used as a quick check on the economic burden of a particular disease in countries without such data.

Relevance:

10.00%

Publisher:

Abstract:

BACKGROUND: We have recently demonstrated that an obese-years construct is a better predictor of the risk of diabetes than the severity of body weight alone. However, those risk estimates were derived from a population cohort study initiated in 1948, which might not apply to the current population.

OBJECTIVE: To validate an obese-years construct in estimating the risk of type-2 diabetes in a more contemporary cohort study.

DESIGN: A total of 5,132 participants of the Framingham Offspring Study, initiated in 1972, were followed up for 45 years. Body mass index (BMI) above 29 kg/m² was multiplied by the number of years lived with obesity at that BMI to define the number of obese-years. Time-dependent Cox regression was used to explore the association.

RESULTS: The risk of type-2 diabetes increased significantly with increasing obese-years. Adjusted hazard ratios increased by 6% (95% CI: 5-7%) per additional 10 points of obese-years. This ratio was similar in men and women, but was 4% higher in current smokers than in never/ex-smokers. The Akaike Information Criterion confirmed that the Cox regression model with the obese-years construct was a stronger predictor of diabetes risk than models including either BMI or the duration of obesity alone.
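
To make the effect size concrete: a hazard ratio of 1.06 per 10 obese-years compounds multiplicatively, so

    HR(\Delta) = 1.06^{\Delta/10}, e.g. HR(50) = 1.06^5 \approx 1.34,

i.e., 50 additional obese-years correspond to a roughly 34% higher hazard of type-2 diabetes.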

CONCLUSIONS: In a contemporary cohort population, we confirmed that the obese-years construct is strongly associated with an increased risk of type-2 diabetes. This suggests that both the severity and the duration of obesity should be considered in future estimations of the burden of disease associated with obesity.

Relevance:

10.00%

Publisher:

Abstract:

Forensic entomology involves the use of insects and other arthropods to estimate the minimum time elapsed since death, referred to as the minimum postmortem interval (minPMI). The estimate is based on the assemblage of insects found in association with remains and, most often, on the time required for the first colonizing insects to develop to their size/life stage at the time of collection. This process requires appropriate developmental data for the insect species at a variety of relevant temperatures, together with consideration of the other biotic and abiotic factors that may affect developmental rate. This review considers approaches to the estimation of minPMI, focusing largely on the age estimation of specimens collected from remains and the limitations that accompany entomology-based PMI estimations. Recent advances and newly developed techniques in the field are reviewed with regard to their future potential.
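
One standard calculation behind such development-based estimates is thermal summation (accumulated degree hours or days). The sketch below illustrates it with hypothetical constants, not validated species data.

def min_pmi_hours(hourly_temps, base_temp, required_adh):
    # Accumulate degree hours above the developmental threshold until the
    # total required for the insect's observed stage is reached.
    accumulated = 0.0
    for hours, temp in enumerate(hourly_temps, start=1):
        accumulated += max(temp - base_temp, 0.0)  # no development below base
        if accumulated >= required_adh:
            return hours
    return None  # temperature record too short to reach the required total

# e.g. a constant 25 C scene, 10 C threshold, 600 accumulated degree hours:
print(min_pmi_hours([25.0] * 100, base_temp=10.0, required_adh=600.0))  # -> 40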

Relevance:

10.00%

Publisher:

Abstract:

The left ventricular ejection fraction is an excellent marker of cardiac function. Several techniques, invasive or not, are used to calculate it: angiography, echocardiography, cardiac magnetic resonance imaging, cardiac CT, radionuclide ventriculography, and myocardial perfusion imaging in nuclear medicine. More than 40 years of scientific publications praise radionuclide ventriculography for its speed of execution, its availability, its low cost and its intra-observer and inter-observer reproducibility. The left ventricular ejection fraction was calculated twice in 47 patients, by two technologists, on two separate acquisitions, using three methods: manual, automatic and semi-automatic. Overall, the automatic and semi-automatic methods show better reproducibility, a smaller standard error of measurement and a smaller minimal detectable difference. The manual method yields a result that is systematically and significantly lower than the other two methods, and it is the only technique that showed a significant difference in the intra-observer analysis. Its standard error of measurement is 40 to 50% larger than with the other techniques, as is its minimal detectable difference. Although all three methods are excellent, reproducible techniques for evaluating the left ventricular ejection fraction, the reliability estimates of the automatic and semi-automatic methods are superior to those of the manual method.
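
For context, radionuclide ventriculography derives the ejection fraction from background-corrected counts rather than volumes:

    LVEF = (C_ED - C_ES) / (C_ED - C_bkg) x 100%

where C_ED and C_ES are the end-diastolic and end-systolic counts and C_bkg the background counts. The minimal detectable difference mentioned above is conventionally obtained from the standard error of measurement as MDC_95 = 1.96 \sqrt{2} SEM \approx 2.77 SEM, which is why a 40-50% larger SEM translates directly into a proportionally larger minimal detectable difference.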

Relevance:

10.00%

Publisher:

Abstract:

Sustainable development has only recently begun to examine existing infrastructure, and a key aspect of this is hazard mitigation. Examining buildings from a sustainability perspective requires an understanding of a building's life-cycle environmental costs, including the environmental impacts induced by earthquake damage. Damage repair entails additional material and energy consumption, with the associated harmful environmental impacts. Merging results obtained from a seismic evaluation and a life-cycle analysis of buildings gives a novel outlook on sustainable design decisions. To evaluate the environmental impacts caused by buildings, both the long-term impacts accrued throughout a building's lifetime and the impacts associated with damage repair need to be quantified. A method and literature review for this examination has been developed and is discussed. Using the software packages Athena and HAZUS-MH, this study evaluated the performance of steel and concrete buildings, considering their life-cycle assessments and earthquake resistance. It was determined that the code design level greatly affects building repair and damage estimations. The study presents two case-study buildings, with results obtained under several stated assumptions, and provides recommendations for future research to make the methodology more useful in real-world applications. Examining the costs and environmental impacts of a building through a cradle-to-grave analysis and seismic damage assessment will help reduce material consumption and construction activity both before and after an earthquake event.
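
The merge of seismic damage and life-cycle results typically reduces to an expectation over damage states; a generic form (not the study's exact notation) is

    E[impact] = I_embodied + \sum_{ds} P(ds | IM) \cdot I_repair(ds)

where P(ds | IM) is the probability of each damage state at a given intensity measure, as produced by HAZUS-MH, and I_repair(ds) the environmental impact of the corresponding repairs, as quantified by the life-cycle assessment.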

Relevance:

10.00%

Publisher:

Abstract:

Over the last decade, advances and innovations from silicon photonics technology have been observed in the telecommunications and computing industries. This technology, which employs silicon as an optical medium, relies on current CMOS micro-electronics fabrication processes to enable medium-scale integration of many nano-photonic devices into photonic integrated circuitry. However, other fields of research, such as optical sensor processing, can also benefit from silicon photonics technology, especially for sensors whose physical measurement is wavelength encoded. In this work, we present the design and application of a thermally tuned silicon photonic device as an optical sensor interrogator. The main device is a micro-ring resonator filter 10 µm in diameter. A photonic design toolkit was developed based on open-source software from the research community, with which the resonance and spectral characteristics of the filter could be estimated. From the resulting design parameters, a 7.8 x 3.8 mm optical chip was fabricated using standard micro-photonics techniques. To tune the ring resonance, Nichrome micro-heaters were fabricated on top of the device. Fabricated devices were systematically characterized and their tuning response was determined. Measurements yielded a ring resonator with a free spectral range of 18.4 nm and a bandwidth of 0.14 nm, and with just 5 mA it was possible to tune the device resonance by up to 3 nm. To apply the device as a sensor interrogator, a wavelength-estimation model based on measuring the time interval between peaks was developed, and simulations were carried out to assess its performance. To test the technique, an experiment using a fiber Bragg grating (FBG) optical sensor was set up; estimates of the sensor's wavelength shift under axial strain agreed with spectrum-analyzer measurements to within 22 pm. These results imply that signals from FBG sensors can be processed with good accuracy using a micro-ring device, with the advantages of compact size, scalability and versatility. The system also has further applications, such as processing optical wavelength shifts from integrated photonic sensors and tracking resonances from laser sources.
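
The reported figures are mutually consistent under the standard ring-resonator relations, assuming operation near 1550 nm (an assumption; the abstract does not state the wavelength):

    FSR = \lambda^2 / (n_g \pi D)  =>  n_g = (1550 nm)^2 / (18.4 nm x \pi x 10 µm) \approx 4.2,

a typical group index for silicon wire waveguides, while the bandwidth implies a loaded quality factor Q = \lambda / \Delta\lambda \approx 1550 / 0.14 \approx 1.1 x 10^4.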

Relevance:

10.00%

Publisher:

Abstract:

Effective decision making uses various databases, including both micro- and macro-level datasets. In many cases, ensuring the consistency of the two levels is a major challenge; different types of problems can occur, and several methods can be used to solve them. This paper concentrates on the input alignment of households' income for microsimulation, which here refers to improving the elements of a micro-data survey (EU-SILC) by using macro data from administrative sources. We use a combined micro-macro model called ECONS-TAX for this improvement. We also produced model projections until 2015, which is important because the official EU-SILC micro database will only become available in Hungary in the summer of 2017. The paper presents our estimations of the dynamics of income elements and the changes in income inequalities. Results show that the aligned data yield a different level of income inequality, but do not affect the direction of change from year to year. However, when we analyzed policy changes, the use of aligned data produced larger differences in both income levels and their dynamics.
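
A minimal sketch of the alignment step described above: proportionally scaling a micro income element so that its weighted total matches an administrative macro aggregate, preserving the within-sample distribution shape. The ECONS-TAX model itself is more elaborate; all names and figures below are illustrative.

def align_income(incomes, weights, macro_total):
    # Scale survey income values so the weighted micro total hits the
    # administrative target.
    micro_total = sum(i * w for i, w in zip(incomes, weights))
    factor = macro_total / micro_total
    return [i * factor for i in incomes]

incomes = [8_000.0, 12_500.0, 30_000.0]   # EU-SILC-like survey records
weights = [900.0, 1_100.0, 450.0]         # survey grossing-up weights
print(align_income(incomes, weights, macro_total=40_000_000.0))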
