902 results for TIME-TREND ANALYSIS
Abstract:
We use advanced statistical tools of time-series analysis to characterize the dynamical complexity of the transition to optical wave turbulence in a fiber laser. Ordinal analysis and the horizontal visibility graph, applied to the experimentally measured laser output intensity, reveal the presence of temporal correlations during the transition from the laminar to the turbulent lasing regimes. Both methods unveil coherent structures with well-defined time scales and strong correlations, both in the timing of the laser pulses and in their peak intensities. Our approach is generic and may be used in other complex systems that undergo similar transitions involving the generation of extreme fluctuations.
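A minimal sketch of the ordinal-analysis step named above, assuming a generic evenly sampled intensity record: it counts ordinal patterns and computes a normalized permutation entropy, which falls below 1 when temporal correlations appear. The embedding dimension, delay, and the synthetic signal are illustrative assumptions, not the parameters or data of the study.

import numpy as np
from math import factorial
from itertools import permutations

def ordinal_pattern_distribution(x, dim=3, delay=1):
    # Relative frequency of each length-`dim` ordinal pattern in the series x.
    counts = {p: 0 for p in permutations(range(dim))}
    n = len(x) - (dim - 1) * delay
    for i in range(n):
        window = x[i:i + dim * delay:delay]
        counts[tuple(np.argsort(window))] += 1
    return {p: c / n for p, c in counts.items()}

def permutation_entropy(x, dim=3, delay=1):
    # Shannon entropy of the pattern distribution, normalized to [0, 1].
    probs = np.array([v for v in ordinal_pattern_distribution(x, dim, delay).values() if v > 0])
    return float(-np.sum(probs * np.log(probs)) / np.log(factorial(dim)))

rng = np.random.default_rng(0)
intensity = rng.normal(size=5000)        # placeholder for the measured laser output intensity
print(permutation_entropy(intensity))    # close to 1 for uncorrelated noise, lower with correlations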
Abstract:
We review some recent results on the application of distributed Raman amplification schemes, including ultralong lasers, to the extension of the operating range and contrast in Brillouin optical time domain analysis (BOTDA) distributed sensing systems. © 2010 IEEE.
The Long-Term Impact of Business Support? - Exploring the Role of Evaluation Timing Using Micro Data
Abstract:
The original contribution of this work is threefold. Firstly, this thesis develops a critical perspective on current evaluation practice of business support, with a focus on the timing of evaluation. The general time frame applied for business support policy evaluation is limited to one to two, seldom three, years post intervention. This is despite calls for long-term impact studies by various authors, concerned about time lags before effects are fully realised. This desire for long-term evaluation conflicts with the requirements of policy-makers and funders, who seek quick results. Also, current ‘best practice’ frameworks do not refer to timing or its implications, and data availability affects the ability to undertake long-term evaluation. Secondly, this thesis provides methodological value for follow-up and similar studies by linking scheme-beneficiary data with official performance datasets; data availability problems are thus avoided through the use of secondary data. Thirdly, this thesis builds the evidence base through a longitudinal impact study of small business support in England, covering seven years of post-intervention data. This illustrates the variability of results for different evaluation periods, and the value of using multiple years of data for a robust understanding of support impact. For survival, the impact of assistance is found to be immediate, but limited. Concerning growth, significant impact centres on a two to three year period post intervention for the linear selection and quantile regression models – positive for employment and turnover, negative for productivity. Attribution of impact may present a problem for subsequent periods. The results clearly support the argument for the use of longitudinal data and analysis, and a greater appreciation by evaluators of the time factor. This analysis recommends a time frame of four to five years post intervention for soft business support evaluation.
Abstract:
The origins of population dynamics depend on the interplay between abiotic and biotic factors, the relative importance of each changing across space and time. Predation is a central feature of ecological communities that removes individuals (consumptive effects) and alters prey traits (non-consumptive effects). Resource quality mitigates non-consumptive predator effects by stimulating growth and reproduction. Disturbance resets predator-prey interactions by removing both predators and prey. I integrate experiments, time-series analysis, and performance trials to examine the relative importance of these factors for the population dynamics of a snail species by studying a variety of its traits. A review of ninety-three published articles revealed that snail abundance was much lower in the Everglades and similar ecosystems than in all other freshwater ecosystems considered. An experiment separating consumptive from non-consumptive (cue) predator effects at different phosphorus levels determined that phosphorus stimulated, while predator cues inhibited, snail growth (34% vs. 23%), activity (38% vs. 53%), and reproductive effort (99% vs. 90%) compared to controls. Cues induced taller shells and smaller openings, and snails moved to refugia, where they reduced periphyton by 8%. Consumptive predator effects were minor in comparison. In a reciprocal transplant cage experiment along a predator-cue and phosphorus gradient created by a canal, snails grew 10% faster and produced 37% more eggs far from the canal (fewer cues) when fed phosphorus-enriched periphyton from near the canal. Time-series analysis at four sites and predator performance trials reveal that phosphorus-enriched regions support larger snail populations, seasonal drying removes snails at all sites, crayfish negatively affect populations in enriched regions, and molluscivorous fish consume snails in the wet season. Combining these studies reveals an interplay between resources, predators, and seasonality that limits snail populations in the Everglades and leads to their low abundance compared to other freshwater ecosystems. Resource quality emerges as the critical factor because improving resources profoundly improved growth and reproduction; seasonal drying and predation become important at particular times and places. This work contributes to the general understanding in ecology of the relative importance of different factors that structure populations and provides evidence that bolsters monitoring efforts to assess the Comprehensive Everglades Restoration Plan, which show that phosphorus enrichment is a major driver of ecosystem change.
Abstract:
Bankruptcy prediction has been a fruitful area of research. Univariate analysis and discriminant analysis were the first methodologies used. While they perform relatively well at correctly classifying bankrupt and non-bankrupt firms, their predictive ability has come into question over time. Univariate analysis lacks the big picture that financial distress entails. Multivariate discriminant analysis requires stringent assumptions that are violated when dealing with accounting ratios and market variables. This has led to the use of more complex models such as neural networks. While the accuracy of the predictions has improved with the use of more technical models, an important point is still missing. Accounting ratios are the usual discriminating variables used in bankruptcy prediction. However, accounting ratios are backward-looking variables; at best, they are a current snapshot of the firm. Market variables are forward-looking variables, determined by discounting future outcomes. Microstructure variables, such as the bid-ask spread, also contain important information. Insiders are privy to more information than the retail investor, so if any financial distress is looming, the insiders should know before the general public. Therefore, any model of bankruptcy prediction should include market and microstructure variables. That is the focus of this dissertation. The traditional models and the newer, more technical models were tested and compared to the previous literature by employing accounting ratios, market variables, and microstructure variables. Our findings suggest that the more technical models are preferable, and that a mix of accounting and market variables is best at correctly classifying and predicting bankrupt firms. Based on the results, the multi-layer perceptron appears to be the most accurate model. The set of best discriminating variables includes price, standard deviation of price, the bid-ask spread, net income to sales, working capital to total assets, and current liabilities to total assets.
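A minimal sketch, under stated assumptions, of the kind of multi-layer perceptron classification described above: a scikit-learn MLP trained on a mix of stand-in accounting, market, and microstructure features. The feature names, synthetic data, and network size are illustrative assumptions only, not the dissertation's data or model.

import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(1)
n = 1000
# Stand-ins for: price, std. dev. of price, bid-ask spread, net income to sales,
# working capital to total assets, current liabilities to total assets.
X = rng.normal(size=(n, 6))
y = (X[:, 2] - X[:, 4] + rng.normal(scale=0.5, size=n) > 1.0).astype(int)  # synthetic bankruptcy label

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, random_state=0)
model = make_pipeline(StandardScaler(),
                      MLPClassifier(hidden_layer_sizes=(16, 8), max_iter=2000, random_state=0))
model.fit(X_train, y_train)
print("holdout accuracy:", model.score(X_test, y_test))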
Abstract:
We organized an international campaign to observe the blazar 0716+714 in the optical band. The observations took place from February 24, 2009 to February 26, 2009. The global campaign was carried out by observers from more than sixteen countries and resulted in an extended light curve nearly seventy-eight hours long. The analysis and modeling of this light curve form the main work of this dissertation project. In the first part of this work, we present the time series and noise analyses of the data. The time series analysis utilizes discrete Fourier transform and wavelet analysis routines to search for periods in the light curve. We then present results of the noise analysis, which is based on the idea that each microvariability curve is the realization of the same underlying stochastic noise processes in the blazar jet. Neither recurring periods nor random noise can successfully explain the observed optical fluctuations. Hence, in the second part, we propose and develop a new model to account for the microvariability we see in blazar 0716+714. We propose that the microvariability is due to the emission from turbulent regions in the jet that are energized by the passage of relativistic shocks. Emission from each turbulent cell forms a pulse of emission that, when convolved with other pulses, yields the observed light curve. We use the model to obtain estimates of the physical parameters of the emission regions in the jet.
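A minimal sketch of the discrete-Fourier-transform period search mentioned above, assuming an evenly sampled light curve. The synthetic light curve, its cadence, and the injected 6-hour signal are assumptions for illustration; the campaign data and the wavelet step are not reproduced here.

import numpy as np

dt_hours = 0.1                                  # assumed sampling interval
t = np.arange(0, 78, dt_hours)                  # ~78-hour light curve, as in the campaign
flux = 1.0 + 0.05 * np.sin(2 * np.pi * t / 6.0) \
       + 0.02 * np.random.default_rng(2).normal(size=t.size)

power = np.abs(np.fft.rfft(flux - flux.mean())) ** 2   # DFT periodogram
freqs = np.fft.rfftfreq(t.size, d=dt_hours)            # cycles per hour

best = freqs[np.argmax(power[1:]) + 1]                 # skip the zero-frequency bin
print(f"strongest period: {1.0 / best:.2f} hours")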
Abstract:
Modern power networks incorporate communications and information technology infrastructure into the electrical power system to create a smart grid in terms of control and operation. The smart grid enables real-time communication and control between consumers and utility companies, allowing suppliers to optimize energy usage based on price preference and system technical issues. The smart grid design aims to provide overall power system monitoring and to create protection and control strategies that maintain system performance, stability and security. This dissertation contributed to the development of a novel smart grid test-bed laboratory with integrated monitoring, protection and control systems. This test-bed was used as a platform to test the smart grid operational ideas developed here. The implementation of this system in real-time software creates an environment for studying, implementing and verifying the novel control and protection schemes developed in this dissertation. Phasor measurement techniques were developed using the available Data Acquisition (DAQ) devices in order to monitor all points in the power system in real time. This provides a practical view of system parameter changes and abnormal conditions, as well as the system's stability and security status. These developments provide valuable measurements for technical power system operators in energy control centers. Phasor measurement technology is an excellent solution for improving system planning, operation and energy trading, in addition to enabling advanced applications in Wide Area Monitoring, Protection and Control (WAMPAC). Moreover, a virtual protection system was developed and implemented in the smart grid laboratory with integrated functionality for wide area applications. Experiments and procedures were developed to detect abnormal system conditions and apply proper remedies to heal the system. A DC microgrid was designed and integrated into the AC system with appropriate control capability. This system represents realistic hybrid AC/DC microgrid connectivity to the AC side and is used to study how such an architecture can help remedy abnormal system conditions. In addition, this dissertation explored the challenges and feasibility of implementing real-time system analysis features in order to monitor system security and stability measures. These indices are measured experimentally during the operation of the developed hybrid AC/DC microgrids. Furthermore, a real-time optimal power flow system was implemented to optimally manage the power sharing between AC generators and DC-side resources. A study of a real-time energy management algorithm in hybrid microgrids was performed to evaluate the effects of using energy storage resources and their use in mitigating heavy load impacts on system stability and operational security.
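A minimal sketch, under stated assumptions, of the kind of computation a DAQ-based phasor measurement routine might perform: estimating a voltage phasor (magnitude and angle) from one cycle of samples with a single-bin DFT. The sampling rate, nominal frequency, and synthetic waveform are illustrative assumptions, not the dissertation's implementation.

import numpy as np

f0 = 60.0                     # nominal system frequency (Hz)
fs = 1920.0                   # assumed sampling rate: 32 samples per cycle
n_cycle = int(fs / f0)

t = np.arange(n_cycle) / fs
v = 169.7 * np.cos(2 * np.pi * f0 * t + np.deg2rad(-30.0))   # synthetic sampled waveform

# Single-bin DFT at the fundamental; the 2/N factor recovers the peak amplitude.
phasor = (2.0 / n_cycle) * np.sum(v * np.exp(-1j * 2 * np.pi * f0 * t))
print(f"magnitude (peak): {abs(phasor):.1f} V, angle: {np.degrees(np.angle(phasor)):.1f} deg")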
Abstract:
Hospitality managers have a number of methods available to them to enhance employee productivity. The author discusses five major concepts that can lead to successful results in the hospitality industry.
Abstract:
Airborne Light Detection and Ranging (LIDAR) technology has become the primary method to derive high-resolution Digital Terrain Models (DTMs), which are essential for studying Earth's surface processes, such as flooding and landslides. The critical step in generating a DTM is to separate ground and non-ground measurements in a voluminous LIDAR point dataset using a filter, because the DTM is created by interpolating ground points. As one of the widely used filtering methods, the progressive morphological (PM) filter has the advantages of point-level classification of the LIDAR data, linear computational complexity, and preservation of the geometric shapes of terrain features. The filter works well in an urban setting with a gentle slope and a mixture of vegetation and buildings. However, the PM filter often removes ground measurements incorrectly at topographic highs, along with large non-ground objects, because it uses a constant threshold slope, resulting in "cut-off" errors. A novel cluster analysis method was developed in this study and incorporated into the PM filter to prevent the removal of ground measurements at topographic highs. Furthermore, to obtain optimal filtering results for an area with undulating terrain, a trend analysis method was developed to adaptively estimate the slope-related thresholds of the PM filter based on changes in topographic slope and the characteristics of non-terrain objects. The comparison of the PM and generalized adaptive PM (GAPM) filters for selected study areas indicates that the GAPM filter preserves most of the "cut-off" points removed incorrectly by the PM filter. The application of the GAPM filter to seven ISPRS benchmark datasets shows that the GAPM filter reduces the filtering error by 20% on average, compared with the method used by the popular commercial software TerraScan. The combination of the cluster method, adaptive trend analysis, and the PM filter allows users without much experience in processing LIDAR data to effectively and efficiently identify ground measurements for complex terrain in a large LIDAR dataset. The GAPM filter is highly automatic and requires little human input. Therefore, it can significantly reduce the effort of manually processing voluminous LIDAR measurements.
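A minimal sketch of the core progressive-morphological-filter idea on gridded elevations: apply morphological openings with growing windows and flag cells whose elevation drops by more than a slope-dependent threshold as non-ground. The window sizes, threshold rule, and synthetic surface are simplified assumptions; this is not the GAPM method or its cluster/trend-analysis extensions.

import numpy as np
from scipy.ndimage import grey_opening

rng = np.random.default_rng(3)
cell = 1.0                                       # grid cell size (m), assumed
ground = np.add.outer(np.linspace(0, 5, 200), np.linspace(0, 2, 200))  # gently sloping ground
surface = ground.copy()
surface[80:100, 80:100] += 8.0                   # a "building" block sitting on the ground

is_ground = np.ones_like(surface, dtype=bool)
slope, dh0, dh_max = 0.3, 0.5, 3.0               # threshold slope and height bounds (assumed)
last = surface
for win in (3, 9, 21):                           # progressively larger windows (cells)
    opened = grey_opening(last, size=(win, win))
    dh_thresh = min(dh0 + slope * (win * cell), dh_max)
    is_ground &= (last - opened) <= dh_thresh    # large drops indicate non-ground objects
    last = opened

print("cells kept as ground:", int(is_ground.sum()), "of", is_ground.size)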
Abstract:
The work on CERP monitoring item 3.1.3.5 (Marl prairie/slough gradients) is being conducted by Florida International University (Dr. Michael Ross, Project Leader), with Everglades National Park (Dr. Craig Smith) providing administrative support and technical consultation. As of January 2006 the funds transferred by ACOE to ENP, and subsequently to FIU, have been entirely expended or encumbered in salaries or wages. The project work for 2005 started rather late in the fiscal year, but ultimately accomplished the Year 1 goals of securing a permit to conduct the research in Everglades National Park, finalizing a detailed scope of work, and sampling marsh sites which are most easily accessed during the wet season. Forty-six plots were sampled in detail, and a preliminary vegetation classification distinguished three groups among these sites (sawgrass marsh, sawgrass and other, and slough) which may be arranged roughly along a hydrologic gradient from least to most persistently inundated. We also made coarser observations of vegetation type at 5-m intervals along 2 transects totaling ~5 km. When these data were compared with similar observations made in 1998-99, it appeared that vegetation in the western portion of Northeast Shark Slough (immediately east of the L-67 extension) had shifted toward a more hydric type during the last 6 years, while vegetation further east was unchanged in this respect. Because this classification and trend analysis is based on a small fraction of the data set that will be available after the first cycle of sampling (3 years from now), the results should not be interpreted too expansively. However, they do demonstrate the potential for gaining a more comprehensive view of marsh vegetation structure and dynamics in the Everglades, and will provide a sound basis for adaptive management.
Abstract:
The prevalence of waterpipe smoking exceeds that of cigarettes among adolescents in the Middle East, where waterpipe is believed to be less harmful, less addictive, and a safer alternative to cigarettes. This dissertation tested the gateway hypothesis that waterpipe can provide a bridge to initiate cigarette smoking, identified the predictors of cigarette smoking progression, and identified predictors of waterpipe smoking progression among a school-based sample of Jordanian adolescents (mean age ± SD: 12.7 ± 0.61 years at baseline). Data for this research were drawn from the Irbid Longitudinal Study of smoking behavior, Jordan (2008-2011). The grouped-time survival analysis showed that waterpipe smokers had a higher risk of cigarette smoking initiation than never smokers (P < 0.001), and this association was dose dependent (P < 0.001). Predictors of cigarette smoking progression were peer smoking and attending public schools for boys, siblings’ smoking for girls, and the urge to smoke for both genders. Predictors of waterpipe smoking progression were enrollment in public schools, frequent physical activity, and low refusal self-efficacy for boys, and ever smoking cigarettes and friends’ and siblings’ waterpipe smoking for girls. Awareness of the harms of waterpipe among boys and seeing warning labels on tobacco packs among girls were protective against waterpipe smoking progression. In conclusion, waterpipe can serve as a gateway to cigarette smoking initiation among adolescents. Waterpipe and cigarette smoking progression among initiators was solely family-related among girls and mainly peer-related among boys. The unique gender differences for both cigarette and waterpipe smoking among Jordanian adolescents in Irbid call for culture- and gender-specific smoking prevention interventions to prevent the progression of smoking among initiators.
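A minimal sketch, under stated assumptions, of a grouped-time (discrete-time) survival analysis of smoking initiation, fitted as a person-period logistic regression. The synthetic cohort, the single "waterpipe" predictor, and the wave structure are illustrative assumptions, not the Irbid study data or its full model.

import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(4)
n, waves = 500, 4
waterpipe = rng.integers(0, 2, size=n)            # baseline waterpipe smoking (0/1), assumed

rows = []
for i in range(n):
    for t in range(1, waves + 1):
        p = 1 / (1 + np.exp(-(-3.0 + 0.9 * waterpipe[i] + 0.2 * t)))   # assumed true hazard
        event = rng.random() < p
        rows.append({"wave": t, "waterpipe": waterpipe[i], "event": int(event)})
        if event:
            break                                  # individual leaves the risk set after initiation

pp = pd.DataFrame(rows)                            # person-period data set
X = sm.add_constant(pp[["wave", "waterpipe"]])
fit = sm.Logit(pp["event"], X).fit(disp=0)
print("odds ratio for initiation given waterpipe use:", np.exp(fit.params["waterpipe"]))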
Abstract:
Analogous to sunspots and solar photospheric faculae, whose visibility is modulated by stellar rotation, stellar active regions consist of cool spots and bright faculae caused by the magnetic field of the star. Such starspots are now well established as major tracers used to estimate the stellar rotation period, but their dynamic behavior may also be used to analyze other relevant phenomena, such as the presence of magnetic activity and its cycles. To calculate the stellar rotation period, identify the presence of active regions, and investigate whether or not the star exhibits differential rotation, we apply two methods: a wavelet analysis and a spot model. The wavelet procedure is also applied here to study pulsation, in order to identify specific signatures of this particular stellar variability for different types of pulsating variable stars. The wavelet transform has been used as a powerful tool for treating several problems in astrophysics. In this work, we show that the time-frequency analysis of stellar light curves using the wavelet transform is a practical tool for identifying rotation, magnetic activity, and pulsation signatures. We present the wavelet spectral composition and multiscale variations of the time series for four classes of stars: targets dominated by magnetic activity, stars with transiting planets, those with binary transits, and pulsating stars. We applied the Morlet wavelet (6th order), which offers high time and frequency resolution. By applying the wavelet transform to the signal, we obtain the local and global wavelet power spectra. The first is interpreted as the energy distribution of the signal in time-frequency space, and the second is obtained by time integration of the local map. Since the wavelet transform is a useful mathematical tool for nonstationary signals, this technique applied to Kepler and CoRoT light curves allows us to clearly identify particular signatures for different phenomena. In particular, patterns were identified for the temporal evolution of the rotation period and other periodicities due to active regions affecting these light curves. In addition, a beat-pattern signature in the local wavelet map of pulsating stars over the entire time span was also detected. The second method is based on starspot detection during transits of an extrasolar planet orbiting its host star. As a planet eclipses its parent star, we can detect physical phenomena on the surface of the star. If a dark spot on the disk of the star is partially or totally eclipsed, the integrated stellar luminosity will increase slightly. By analyzing the transit light curve it is possible to infer the physical properties of starspots, such as size, intensity, position and temperature. By detecting the same spot on consecutive transits, it is possible to obtain additional information such as the stellar rotation period at the planetary transit latitude, differential rotation, and magnetic activity cycles. Transit observations of CoRoT-18 and Kepler-17 were used to implement this model.
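A minimal sketch of a Morlet (order-6) wavelet power spectrum of a light curve, with the global spectrum obtained by time-averaging the local map, as described above. The rotation-modulated synthetic light curve, cadence, and scale grid are assumptions for illustration, not Kepler or CoRoT data or the authors' pipeline.

import numpy as np
from scipy.signal import fftconvolve

def morlet_power(x, dt, scales, w0=6.0):
    # Local wavelet power |W(scale, time)|^2 using a complex Morlet wavelet of order w0.
    x = x - x.mean()
    tau = (np.arange(len(x)) - len(x) // 2) * dt      # kernel time axis, centered
    power = np.empty((len(scales), len(x)))
    for i, s in enumerate(scales):
        kernel = np.exp(1j * w0 * tau / s) * np.exp(-0.5 * (tau / s) ** 2) / np.sqrt(s)
        power[i] = np.abs(fftconvolve(x, kernel, mode="same") * dt) ** 2
    return power

dt = 0.02                                             # days between samples (assumed cadence)
t = np.arange(0, 90, dt)
flux = 1 + 0.01 * np.sin(2 * np.pi * t / 12.0)        # 12-day rotational modulation, illustrative
flux += 0.002 * np.random.default_rng(5).normal(size=t.size)

scales = np.linspace(1.0, 20.0, 80)                   # wavelet scales in days
local = morlet_power(flux, dt, scales)                # local map: energy in time-frequency space
global_spec = local.mean(axis=1)                      # global spectrum: time integration of the map
w0 = 6.0
period = scales * 4 * np.pi / (w0 + np.sqrt(2 + w0 ** 2))   # approximate Fourier-period conversion
print(f"dominant period ~ {period[np.argmax(global_spec)]:.1f} days")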
Abstract:
This master's thesis analyzes the activity of operators in a control room for onshore petroleum production processes, with a focus on the sociotechnical constraints that interfere with the operators' decision-making and actions, and therefore on the individual and collective strategies used to regulate the work and maintain both the required operator performance and the safety of the system. The activity in focus involves the supervision and control of the production of thousands of barrels of oil per day across complex and dispersed production structures spread over an extension of 80 km. This operational framework highlights the importance of this activity for meeting local and corporate efficiency targets, good management of the environment, and the health and safety of operators. This is exploratory field research, using the methodology of Ergonomic Work Analysis, composed of observational and interactional techniques, with the control room of an oil company's onshore production processes as its locus. The research population is formed by the control room operators of a Brazilian oil company. The results showed that supervising and controlling superheated steam injection is a complex activity that demands heightened attention, concentration, calculations, comparisons, trend analysis and decision making. The activity is collectively constructed between the control room operator, the field operator and the steam supplier. The research showed that the processes of communication and collaboration between the control room, the field and the support staff are the key elements of this activity. The study shows that the operators have the autonomy and the elements necessary for the work; that there is continuous investment to improve the technology used; and that the operators report sleep disturbances as a result of chronic exposure to night work. The study contributed proposals for transforming this activity: installing an area reserved for meals in the control room, updating the supervisory screens to reflect the current operating condition, periodic field visits by control room operators, standardization of production reports, and the development of aids and a standardized nomenclature for the steam-system control stations, in order to improve the conditions in which the activity is carried out, improve the quality of the products produced by the operators, and help reduce the possibility of slips or errors in the activity.
Abstract:
This work proposes a modified control chart incorporating concepts of time series analysis. Specifically, we consider Gaussian mixture transition distribution (GMTD) models. The GMTD models are a more general class than the autoregressive (AR) family, in the sense that the autocorrelated processes may present flat stretches, bursts or outliers. In this scenario, traditional Shewhart charts are no longer appropriate tools for monitoring such processes. Therefore, Vasilopoulos and Stamboulis (1978) proposed a modified version of those charts, with proper control limits based on autocorrelated processes. In order to evaluate the efficiency of the proposed technique, a comparison was made with a traditional Shewhart chart (which ignores the autocorrelation structure of the process), an AR(1) Shewhart control chart and a GMTD Shewhart control chart. An analytical expression for the process variance, as well as the control limits, was developed for a particular GMTD model. The ARL was used as a criterion to measure the efficiency of the control charts. The comparison was based on series generated according to a GMTD model. The results indicate that the modified Shewhart GMTD charts have a better performance than the AR(1) Shewhart and the traditional Shewhart charts.
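A minimal sketch, under stated assumptions, of the idea behind the comparison above: simulate a simplified two-component, lag-1 GMTD process (a quiet component plus an occasional high-variance "burst" component), set Shewhart-type limits from the process standard deviation rather than the i.i.d. assumption, and estimate the in-control ARL by simulation. All parameter values are illustrative assumptions, not those of the paper, and the analytical variance and limits are replaced here by Monte Carlo estimates.

import numpy as np

rng = np.random.default_rng(6)

def simulate_gmtd(n, alphas=(0.9, 0.1), phis=(0.5, 0.9), sigmas=(1.0, 3.0), burn=200):
    # Lag-1, two-component GMTD: at each step one mixture component supplies the AR coefficient
    # and the innovation scale, producing flat stretches (component 0) and bursts (component 1).
    g = rng.choice(len(alphas), size=n + burn, p=alphas)
    eps = rng.normal(size=n + burn)
    x = np.zeros(n + burn)
    for t in range(1, n + burn):
        x[t] = phis[g[t]] * x[t - 1] + sigmas[g[t]] * eps[t]
    return x[burn:]

# Modified limits based on the stationary process standard deviation (estimated by simulation).
sigma_proc = simulate_gmtd(200_000).std()
L = 3.0
ucl, lcl = L * sigma_proc, -L * sigma_proc

# In-control ARL: average run length until an observation falls outside the limits.
run_lengths = []
for _ in range(400):
    x = simulate_gmtd(5_000)
    out = np.flatnonzero((x > ucl) | (x < lcl))
    run_lengths.append(out[0] + 1 if out.size else len(x))
print(f"process sd ~ {sigma_proc:.2f}, estimated in-control ARL ~ {np.mean(run_lengths):.0f}")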