882 results for mixed stock analysis
Abstract:
Transition Year (TY) has been a feature of the Irish education landscape for 39 years. Work experience (WE) has become a key component of TY. WE is defined as a module of between five and fifteen days' duration where students engage in a work placement in the broader community. It places a major emphasis on building relationships between schools and their external communities and, concomitantly, between students and their potential future employers. Yet the idea that participation in a TY work experience programme could facilitate an increased awareness of potential careers has drawn little attention from the research community. This research examines the influence WE has on the subsequent subject choices made by students, along with the effects of that experience on the students’ identities and emerging vocational identities. Socio-cultural Learning Theory and Occupational Choice Theory frame the overall study. A mixed methods approach to data collection was adopted through the administration of 323 quantitative questionnaires and 32 individual semi-structured interviews in three secondary schools. The analysis of the data was conducted using a grounded theory approach. The findings from the research show that WE makes a significant contribution to the students’ sense of agency in their own lives. It facilitates the otherwise complex process of subject choice, motivates students to work harder in their senior cycle, and introduces them to the concepts of active, experience-based and self-directed learning, while boosting their self-confidence and nurturing the emergence of their personal and vocational identities. This research is a gateway to further study in this field. It also has wide-reaching implications for students, teachers, school authorities, parents and policy makers regarding teaching and learning in our schools and the value of learning beyond the walls of the classroom.
Abstract:
The aim of this study was to investigate the symptom burden experiences of individuals with inflammatory bowel disease (IBD). An explanatory sequential mixed methods study was conducted. A cross-sectional, correlational survey was first undertaken. Symptom burden was measured using a modified, disease-specific version of the Memorial Symptom Assessment Scale, which was administered to a consecutive sample of individuals with IBD (n = 247) at an IBD outpatients department in one urban teaching hospital in Ireland. Disease activity was determined using clinical disease activity indices, which were completed by the consulting physician. A sequential qualitative, descriptive study was then conducted, aimed at explaining noteworthy quantitative findings. A criterion-related purposeful sample of seven participants from the quantitative study was recruited. Semi-structured face-to-face interviews were conducted using an interview guide and data were analysed using content analysis. Findings revealed that participants experienced a median of 10 symptoms during the last week; however, as many as 16 symptoms were experienced during active disease. The most burdensome symptoms were lack of energy, bowel urgency, diarrhoea, feeling bloated, flatulence and worry. Total symptom burden was found to be low, with a mean score of 0.56 out of a possible range of 0 to 4. Participants with active disease (M = 0.81, SD = 0.48; n = 68) had almost double the mean total symptom burden score of participants with inactive disease (M = 0.46, SD = 0.43; n = 166) (p < 0.001). Mean total psychological symptom burden was found to be significantly greater than mean total physical symptom burden (rho = 0.73, n = 247, p < 0.001). Self-reported disease control, gender, number of flare-ups in the last two years, and smoking status were found to be significant predictors of total symptom burden, with self-reported disease control identified as the strongest predictor. Qualitative data revealed tiredness, pain, bowel symptoms, worry and fear as being burdensome. Furthermore, symptom burden experiences were described in terms of their impact on restricting aspects of daily activities, which accumulated into restrictions on general life events. Psychological symptom burden was revealed as more problematic than physical symptom burden due to its constant nature, with physical and psychological symptoms described as occurring in a cyclical manner. Participants revealed that disease control was evaluated not only in terms of symptoms, but also in terms of their abilities to control the impact of symptoms on their lives. This study highlights the considerable number of symptoms and the most burdensome symptoms experienced by individuals with IBD, during both active and inactive disease. The findings have important implications for symptom assessment, in particular the need to encompass both physical and psychological symptoms. In addition, greater attention needs to be placed on the psychological aspects of IBD care.
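As an illustrative cross-check (an assumption about the form of the comparison, not the study's own analysis), the active versus inactive contrast quoted above can be reproduced approximately from the summary statistics alone:

```python
# Illustrative sketch: re-derive the active vs. inactive comparison from the
# summary statistics quoted in the abstract using Welch's t-test; the choice
# of test is an assumption for illustration, not the study's reported method.
from scipy.stats import ttest_ind_from_stats

t_stat, p_value = ttest_ind_from_stats(
    mean1=0.81, std1=0.48, nobs1=68,    # active disease group
    mean2=0.46, std2=0.43, nobs2=166,   # inactive disease group
    equal_var=False,                    # Welch's correction for unequal variances
)
print(f"t = {t_stat:.2f}, p = {p_value:.2g}")  # p is well below 0.001
```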
Abstract:
This study explores the experiences of stress and burnout in Irish second-level teachers and examines the contribution of a number of individual, environmental and health factors to burnout development. As no such study has previously been carried out with this population, a mixed-methods approach was adopted in order to investigate the subject matter comprehensively. Teaching has consistently been identified as a particularly stressful occupation, and research investigating how burnout develops is of great importance in designing measures to address the problem. The first phase of the study involved focus groups conducted with a total of 20 second-level teachers from 11 different schools in the greater Cork city area. Findings suggest that teachers experience a variety of stressors – in class, in the staff room and outside of school. The second phase of the study employed a survey to examine the factors associated with burnout. Analysis of 192 responses suggested that burnout results from a combination of demographic, personality, environmental and coping factors. Burnout was also found to be associated with a number of physical symptoms, particularly trouble sleeping and fatigue. Findings suggest that interventions designed to reduce burnout must reflect the complexity of the problem and its development. Based on the research findings, interventions that combine individual and organisational approaches should provide the best chance of effectively tackling burnout.
Abstract:
It is estimated that the quantity of digital data being transferred, processed or stored at any one time currently stands at 4.4 zettabytes (4.4 × 2⁷⁰ bytes), and this figure is expected to have grown by a factor of 10, to 44 zettabytes, by 2020. Exploiting this data is, and will remain, a significant challenge. At present there is the capacity to store 33% of the digital data in existence at any one time; by 2020 this capacity is expected to fall to 15%. These statistics suggest that, in the era of Big Data, the identification of important, exploitable data will need to be done in a timely manner. Systems for the monitoring and analysis of data, e.g. stock markets, smart grids and sensor networks, can be made up of massive numbers of individual components. These components can be geographically distributed yet may interact with one another via continuous data streams, which in turn may affect the state of the sender or receiver. This introduces a dynamic causality, which further complicates the overall system by introducing a temporal constraint that is difficult to accommodate. Practical approaches to realising the system described above have led to a multiplicity of analysis techniques, each of which concentrates on specific characteristics of the system being analysed and treats these characteristics as the dominant component affecting the results being sought. The multiplicity of analysis techniques introduces another layer of heterogeneity, that is, heterogeneity of approach, partitioning the field to the extent that results from one domain are difficult to exploit in another. The question asked is whether a generic solution for the monitoring and analysis of data can be identified that accommodates temporal constraints, bridges the gap between expert knowledge and raw data, and enables data to be effectively interpreted and exploited in a transparent manner. The approach proposed in this dissertation acquires, analyses and processes data in a manner that is free of the constraints of any particular analysis technique, while at the same time facilitating these techniques where appropriate. Constraints are applied by defining a workflow based on the production, interpretation and consumption of data. This supports the application of different analysis techniques to the same raw data without the danger of incorporating hidden bias that may exist. To illustrate and to realise this approach, a software platform has been created that allows for the transparent analysis of data, combining analysis techniques with a maintainable record of provenance so that independent third-party analysis can be applied to verify any derived conclusions. In order to demonstrate these concepts, a complex real-world example involving the near real-time capture and analysis of neurophysiological data from a neonatal intensive care unit (NICU) was chosen. A system was engineered to gather raw data, analyse that data using different analysis techniques, uncover information, incorporate that information into the system and curate the evolution of the discovered knowledge. The application domain was chosen for three reasons: firstly, because it is complex and no comprehensive solution exists; secondly, because it requires tight interaction with domain experts, and thus the handling of subjective knowledge and inference; and thirdly, because, given the dearth of neurophysiologists, there is a real-world need to provide a solution for this domain.
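As a toy illustration of the workflow idea described above (a sketch only, not the dissertation's platform; all names here are hypothetical), each derived result can carry a maintainable provenance record describing how it was produced, so that the same raw data can be passed through different analysis techniques and any conclusion can be audited independently:

```python
import hashlib
import json
import time
from dataclasses import dataclass, field
from typing import Any, Callable, List


@dataclass
class Record:
    """A data product together with the provenance trail that produced it."""
    value: Any
    provenance: List[dict] = field(default_factory=list)


def apply_step(record: Record, step: Callable[[Any], Any], name: str) -> Record:
    """Apply one analysis step and append a provenance entry describing it."""
    input_digest = hashlib.sha256(repr(record.value).encode()).hexdigest()[:12]
    result = step(record.value)
    entry = {"step": name, "input_sha256": input_digest, "timestamp": time.time()}
    return Record(value=result, provenance=record.provenance + [entry])


# The same raw data analysed by two interchangeable techniques, each result
# carrying its own auditable record of how it was derived.
raw = Record(value=[3.1, 2.7, 4.5, 3.9])
mean_based = apply_step(raw, lambda xs: sum(xs) / len(xs), "mean")
peak_based = apply_step(raw, lambda xs: max(xs), "peak")
print(json.dumps(mean_based.provenance, indent=2))
```

The raw data are never altered by an analysis step, which is what keeps the choice of technique free of hidden bias and leaves the provenance trail available for third-party verification.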
Abstract:
This paper uses dynamic impulse response analysis to investigate the interrelationships among stock price volatility, trading volume, and the leverage effect. Dynamic impulse response analysis is a technique for analyzing the multi-step-ahead characteristics of a nonparametric estimate of the one-step conditional density of a strictly stationary process. The technique is the generalization to a nonlinear process of Sims-style impulse response analysis for linear models. In this paper, we refine the technique and apply it to a long panel of daily observations on the price and trading volume of four stocks actively traded on the NYSE: Boeing, Coca-Cola, IBM, and MMM.
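For orientation, the kind of object such an impulse response analysis compares can be sketched as follows; the notation is illustrative rather than taken from the paper. Given the estimated one-step conditional density of the process, one forms multi-step conditional moment profiles, for example the conditional mean profile, and measures the response to a shock as

\[
\hat{y}_j(x) = E\left[\, y_{t+j} \mid x_t = x \,\right], \qquad
\mathrm{IR}_j(x,\delta) = \hat{y}_j(x+\delta) - \hat{y}_j(x),
\]

where \(x_t\) collects the lagged prices and volumes that enter the conditioning set, the profile \(\hat{y}_j\) is obtained by iterating (or simulating from) the estimated one-step density, and \(\delta\) is the perturbation of interest; for a linear process this reduces to the familiar Sims-style impulse response.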
Abstract:
Gaussian factor models have proven widely useful for parsimoniously characterizing dependence in multivariate data. There is a rich literature on their extension to mixed categorical and continuous variables, using latent Gaussian variables or through generalized latent trait models accommodating measurements in the exponential family. However, when generalizing to non-Gaussian measured variables, the latent variables typically influence both the dependence structure and the form of the marginal distributions, complicating interpretation and introducing artifacts. To address this problem, we propose a novel class of Bayesian Gaussian copula factor models which decouple the latent factors from the marginal distributions. A semiparametric specification for the marginals based on the extended rank likelihood yields straightforward implementation and substantial computational gains. We provide new theoretical and empirical justifications for using this likelihood in Bayesian inference. We propose new default priors for the factor loadings and develop efficient parameter-expanded Gibbs sampling for posterior computation. The methods are evaluated through simulations and applied to a dataset in political science. The models in this paper are implemented in the R package bfa.
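The general structure implied by the abstract can be sketched as follows (the symbols are illustrative): latent Gaussian factors drive only the dependence, while the margins remain free,

\[
\eta_i \sim N_k(0, I), \qquad
z_i = \Lambda \eta_i + \epsilon_i, \quad \epsilon_i \sim N_p(0, \Sigma), \qquad
y_{ij} = F_j^{-1}\!\big(\Phi(z_{ij})\big),
\]

so that the loadings \(\Lambda\) determine the copula (the latent \(z_i\) being on a correlation scale), and the marginal distributions \(F_j\) are handled semiparametrically through the extended rank likelihood rather than modelled parametrically.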
Abstract:
In regression analysis of counts, a lack of simple and efficient algorithms for posterior computation has made Bayesian approaches appear unattractive and thus underdeveloped. We propose a lognormal and gamma mixed negative binomial (NB) regression model for counts, and present efficient closed-form Bayesian inference; unlike conventional Poisson models, the proposed approach has two free parameters to include two different kinds of random effects, and allows the incorporation of prior information, such as sparsity in the regression coefficients. By placing a gamma distribution prior on the NB dispersion parameter r, and connecting a log-normal distribution prior with the logit of the NB probability parameter p, efficient Gibbs sampling and variational Bayes inference are both developed. The closed-form updates are obtained by exploiting conditional conjugacy via both a compound Poisson representation and a Polya-Gamma distribution based data augmentation approach. The proposed Bayesian inference can be implemented routinely, while being easily generalizable to more complex settings involving multivariate dependence structures. The algorithms are illustrated using real examples.
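One way to write the model described in the abstract is sketched below; the hyperparameter symbols are illustrative.

\[
y_i \sim \mathrm{NB}(r, p_i), \qquad
\mathrm{logit}(p_i) = x_i^{\top}\beta + \epsilon_i, \quad \epsilon_i \sim N(0, \sigma^2), \qquad
r \sim \mathrm{Gamma}(a_0, 1/b_0),
\]

where the normal term \(\epsilon_i\) on the logit scale corresponds to a log-normal multiplicative effect on the odds and, together with the gamma-distributed dispersion \(r\), supplies the two kinds of random effects; the compound Poisson representation of the negative binomial and Polya-Gamma data augmentation are what make the conditional updates for \(r\) and \(\beta\) available in closed form.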
Abstract:
PREMISE OF THE STUDY: We investigated the origins of 252 Southern Appalachian woody species representing 158 clades to analyze larger patterns of biogeographic connectivity around the northern hemisphere. We tested biogeographic hypotheses regarding the timing of species disjunctions to eastern Asia and among areas of North America. METHODS: We delimited species into biogeographically informative clades, compiled sister-area data, and generated graphic representations of area connections across clades. We calculated taxon diversity within clades and plotted divergence times. KEY RESULTS: Of the total taxon diversity, 45% were distributed among 25 North American endemic clades. Sister taxa within eastern North America and eastern Asia were proportionally equal in frequency, accounting for over 50% of the sister-area connections. At increasing phylogenetic depth, connections to the Old World dominated. Divergence times for 65 clades with intercontinental disjunctions were continuous, whereas 11 intracontinental disjunctions to western North America and nine to eastern Mexico were temporally congruent. CONCLUSIONS: Over one third of the clades have likely undergone speciation within the region of eastern North America. The biogeographic pattern for the region is asymmetric, consisting mostly of mixed-age, low-diversity clades connecting to the Old World, and a minority of New World clades. Divergence time data suggest that climate change in the Late Miocene to Early Pliocene generated disjunct patterns within North America. Continuous splitting times during the last 45 million years support the hypothesis that widespread distributions formed repeatedly during favorable periods, with serial cooling trends producing pseudocongruent area disjunctions between eastern North America and eastern Asia.
Abstract:
Motivated by recent findings in the field of consumer science, this paper evaluates the causal effect of debit cards on household consumption using population-based data from the Italian Survey on Household Income and Wealth (SHIW). Within the Rubin Causal Model, we focus on the estimand of the population average treatment effect for the treated (PATT). We consider three existing estimators, based respectively on regression, on mixed matching and regression, and on propensity score weighting, and propose a new doubly-robust estimator. A semiparametric specification based on power series is adopted for the potential outcomes and the propensity score. Cross-validation is used to select the order of the power series. We conduct a simulation study to compare the performance of the estimators. The key assumptions, overlap and unconfoundedness, are systematically assessed and validated in the application. Our empirical results suggest statistically significant positive effects of debit cards on monthly household spending in Italy.
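For orientation, a common doubly-robust form for the treated-population average treatment effect combines an outcome regression fitted on the controls with propensity-score weighting; the estimator proposed in the paper may differ in its details.

\[
\hat{\tau}_{\mathrm{PATT}} = \frac{1}{N_1}\sum_{i=1}^{N}\left[ T_i\,\big\{Y_i - \hat{\mu}_0(X_i)\big\} \;-\; (1-T_i)\,\frac{\hat{e}(X_i)}{1-\hat{e}(X_i)}\,\big\{Y_i - \hat{\mu}_0(X_i)\big\} \right],
\]

where \(N_1\) is the number of treated households, \(\hat{e}\) the estimated propensity score and \(\hat{\mu}_0\) the estimated control-outcome regression; the estimator remains consistent if either nuisance model is correctly specified, which is the sense in which it is doubly robust.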
Abstract:
This study investigates a longitudinal dataset consisting of financial and operational data from 37 companies listed on the Vietnamese stock market, covering the period 2004-13. By performing three main types of regression analysis - pooled OLS, fixed-effect and random-effect regressions - the investigation finds mixed results on the relationships between operational scales, sources of finance and firms’ performance, depending on the choice of analytical model and the use of independent/dependent variables. In most situations, fixed-effect models appear preferable, providing reasonably consistent results. The paper closes with some further explanation of the obtained insights, which reflect the nature of the business environment of a transition economy and an emerging market.
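In the usual panel notation, the three specifications compared above can be sketched as

\[
\text{pooled OLS: } y_{it} = \alpha + x_{it}^{\top}\beta + u_{it}, \qquad
\text{fixed effects: } y_{it} = \alpha_i + x_{it}^{\top}\beta + u_{it}, \qquad
\text{random effects: } y_{it} = \alpha + x_{it}^{\top}\beta + v_i + u_{it},
\]

where \(y_{it}\) is a performance measure for firm \(i\) in year \(t\) and \(x_{it}\) collects the scale and financing variables; the firm-specific intercepts \(\alpha_i\) absorb time-invariant heterogeneity, whereas the random-effects model treats \(v_i\) as a zero-mean disturbance uncorrelated with the regressors, and doubt about that orthogonality assumption is the usual reason fixed effects is preferred.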
Abstract:
During the 1970s and 1980s, the late Dr Norman Holme undertook extensive towed sledge surveys in the English Channel and some in the Irish Sea. Only a minority of the resulting images were analysed and reported before his death in 1989, but logbooks, video and film material have been archived in the National Marine Biological Library (NMBL) in Plymouth. A study was therefore commissioned by the Joint Nature Conservation Committee, as a part of the Mapping European Seabed Habitats (MESH) project, to identify the value of the material archived and the procedure and cost to undertake further work (Phase 1 of the study reported here: Oakley & Hiscock, 2005). Some image analysis was undertaken as a part of Phase 1. Phase 2 (this report) was to further analyse selected images. Having determined in Phase 1 that only the 35 mm photographic transparencies provided sufficient clarity to identify species and biotopes, the tows selected for analysis were ones where 35 mm images had been taken. The tows selected for analysis of images were mainly in the vicinity of Plymouth and especially along the area between Rame Head and the region of the Eddystone. The 35 mm films were viewed under a binocular microscope and the taxa that could be recognised were recorded in note form. Twenty-five images were selected for inclusion in the report. Almost all of the images were of level sediment seabed. Where rocks were included, it was usually unplanned and the sled was hauled before being caught or damaged. The main biotopes or biotope complexes identified were: SS.SMU.CSaMu. Circalittoral sandy mud. Extensively present between the shore and the Eddystone Reef complex and at depths of about 48 to 52 m. At one site offshore of Plymouth Sound, the turret shell Turritella communis was abundant. In some areas, this biotope had dense anemones, Mesacmaea mitchelli and (more rarely) Cerianthus lloydii. Queen scallops, Aequipecten opercularis, and king scallops, Pecten maximus, were sometimes present in small numbers. Hard substratum species such as hydroids, dead men’s fingers Alcyonium digitatum and the cup coral Caryophyllia smithii occurred in a few places, probably attached to shells or stones beneath the surface. South of the spoil ground off Hilsea Point at 57 m depth, the sediment was muddier but is still assigned to this biotope complex. It is notable that three small sea pens, most likely Virgularia mirabilis, were seen here. SS.SMx.CMx. Circalittoral mixed sediment. Further offshore, but at about the same depth as SS.SMU.CSaMu occurred, coarse gravel with some silt was present. The sediment was characterised most conspicuously by small queen scallops, Aequipecten opercularis. Peculiarly, there were ‘bundles’ of the branching bryozoan Cellaria sp. – a species normally found attached to rock. It could not be seen whether these bundles of Cellaria had been brought together by terebellid worms, but it is notable that Cellaria is recorded in historical surveys. As with many other sediments, there were occasional brittle stars, Ophiocomina nigra and Ophiura ophiura. Where sediments were muddy, the burrowing anemone Mesacmaea mitchelli was common. Where pebbles or cobbles occurred, there were attached species such as Alcyonium digitatum, Caryophyllia smithii and the fleshy bryozoan Alcyonidium diaphanum. Undescribed biotope. Although most likely a part of SS.SMx.CMx, the biotope visually dominated by a terebellid worm believed to be Thelepus cincinnatus is worth special attention as it may be an undescribed biotope.
The biotope occurred about 22 nautical miles south of the latitude of the Eddystone and in depths in excess of 70 m. SS.SCS.CCS.Blan. Branchiostoma lanceolatum in circalittoral coarse sand with shell gravel at about 48 m depth and less. This habitat was the ‘classic’ Eddystone Shell Gravel, which is sampled for Branchiostoma lanceolatum. However, no Branchiostoma lanceolatum could be seen. The gravel was almost entirely bare of epibiota. There were occasional rock outcrops or cobbles which had epibiota including encrusting calcareous algae, the sea fan Eunicella verrucosa, cup corals, Caryophyllia smithii, hydroids and a sea urchin Echinus esculentus. The variety of species visible on the surface is small and therefore identification to biotope is not usually possible. Historical records from sampling surveys that used grabs and dredges at the end of the 19th century and early 20th century suggest that similar species were present then. Illustrations of some of the infaunal communities from work in the 1920s are included in this report to provide a context to the epifaunal photographs.
Abstract:
We present a unique view of mackerel (Scomber scombrus) in the North Sea based on a new time series of larvae caught by the Continuous Plankton Recorder (CPR) survey from 1948 to 2005, covering the period both before and after the collapse of the North Sea stock. Hydrographic backtrack modelling suggested that the effect of advection between spawning and larval capture in the CPR survey is very limited. Using a statistical technique not previously applied to CPR data, we then generated a larval index that accounts for catchability as well as spatial and temporal autocorrelation. The resulting time series documents the significant decrease in spawning from before 1970 to recent depleted levels. Spatial distributions of the larvae, and thus of the spawning area, showed a shift from early to recent decades, suggesting that the central North Sea is no longer as important as the areas further west and south. These results provide a consistent and unique perspective on the dynamics of mackerel in this region and can potentially resolve many of the open questions about this stock.
Abstract:
The Nassau grouper, Epinephelus striatus (Bloch, 1792), is an endangered species that has been historically overexploited in numerous fisheries throughout its range in the Caribbean and tropical West Atlantic. Data relating fishery exploitation levels to stock abundance of the species are deficient, and protective regulations for the Nassau grouper are yet to be implemented in the Turks and Caicos Islands (TCI). The goal of this study was to conduct a stock assessment and evaluate the exploitation status of the Nassau grouper in the TCI. Materials and methods. Calibrated length cohort analysis was applied to published fisheries data on Nassau grouper landings in the TCI. The total lengths of Nassau groupers among the catches of spearfishers, lobster trappers, and deep sea fishers on the island of South Caicos during 2006 and 2008 were used with estimates of growth, natural mortality, and total annual landings to derive exploitation benchmarks. Results. The TCI stock experienced low to moderate fishing mortality (0.28, 0.18) and exploitation rates (0.49, 0.38) during the period of the study (2006, 2008). However, 21.2%-64.4% of all landings were reproductively immature. Spearfishing appeared to contribute most to fishing mortality relative to the use of lobster traps or hydraulic reels along bank drop-offs. Conclusion. In comparison with available fisheries data for the wider Caribbean, the results reveal the TCI as one of the remaining sites, in addition to the Bahamas, with a substantial Nassau grouper stock. In light of increasing development and tourism in the TCI, continued monitoring is essential to maintain sustainable harvesting practices.
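Assuming the conventional definition of the exploitation rate as the share of total mortality attributable to fishing, the figures quoted above are internally consistent:

\[
E = \frac{F}{F + M}, \qquad
\frac{0.28}{0.28 + M} \approx 0.49 \;\Rightarrow\; M \approx 0.29\ \mathrm{yr}^{-1}, \qquad
\frac{0.18}{0.18 + 0.29} \approx 0.38,
\]

i.e. both year-specific pairs of fishing mortality and exploitation rate imply roughly the same natural mortality, as would be expected if a single estimate of \(M\) was used in the calibrated length cohort analysis.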
Abstract:
Phytoplankton size structure is an important indicator of the state of the pelagic ecosystem. Stimulated by the paucity of in situ observations on size structure, and by the sampling advantages of autonomous remote platforms, new efforts are being made to infer the size-structure of the phytoplankton from oceanographic variables that may be measured at high temporal and spatial resolution, such as total chlorophyll concentration. Large-scale analysis of in situ data has revealed coherent relationships between size-fractionated chlorophyll and total chlorophyll that can be quantified using the three-component model of Brewin et al. (2010). However, there are variations surrounding these general relationships. In this paper, we first revise the three-component model using a global dataset of surface phytoplankton pigment measurements. Then, using estimates of the average irradiance in the mixed-layer, we investigate the influence of ambient light on the parameters of the three-component model. We observe significant relationships between model parameters and the average irradiance in the mixed-layer, consistent with ecological knowledge. These relationships are incorporated explicitly into the three-component model to illustrate variations in the relationship between size-structure and total chlorophyll, ensuing from variations in light availability. The new model may be used as a tool to investigate modifications in size-structure in the context of a changing climate.
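The three-component model referred to here is commonly written as two saturating functions of total chlorophyll \(C\), with the microphytoplankton fraction obtained as the residual; the parameter symbols below are illustrative.

\[
C_{\mathrm{pico+nano}} = C^{m}_{\mathrm{pico+nano}}\big[1 - \exp(-D_{\mathrm{pico+nano}}\,C)\big], \qquad
C_{\mathrm{pico}} = C^{m}_{\mathrm{pico}}\big[1 - \exp(-D_{\mathrm{pico}}\,C)\big],
\]
\[
C_{\mathrm{nano}} = C_{\mathrm{pico+nano}} - C_{\mathrm{pico}}, \qquad
C_{\mathrm{micro}} = C - C_{\mathrm{pico+nano}},
\]

where the \(C^{m}\) are the asymptotic maximum concentrations of the two small-size-class fractions and the \(D\) parameters govern how quickly those fractions saturate as total chlorophyll increases; it is these parameters that the present study relates to the average irradiance in the mixed layer.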
Willingness to Pay for Rural Landscape Improvements: Combining Mixed Logit and Random-Effects Models
Abstract:
This paper reports the findings from a discrete-choice experiment designed to estimate the economic benefits associated with rural landscape improvements in Ireland. Using a mixed logit model, the panel nature of the dataset is exploited to retrieve willingness-to-pay values for every individual in the sample. This departs from customary approaches in which the willingness-to-pay estimates are normally expressed as measures of central tendency of an a priori distribution. Random-effects models for panel data are subsequently used to identify the determinants of the individual-specific willingness-to-pay estimates. In comparison with the standard methods used to incorporate individual-specific variables into the analysis of discrete-choice experiments, the analytical approach outlined in this paper is shown to add considerable explanatory power to the welfare estimates.
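In outline, and with illustrative notation, the individual-specific values are obtained by letting the taste coefficients vary over respondents and converting them to money terms:

\[
U_{njt} = \beta_n^{\top} x_{njt} + \varepsilon_{njt}, \qquad
\mathrm{WTP}_{nk} = -\,\frac{\beta_{nk}}{\beta_{n,\mathrm{cost}}},
\]

where \(\beta_n\) is the respondent-specific coefficient vector recovered from the mixed logit, conditioned on the panel of choices each individual made, and the ratio of a landscape-attribute coefficient to the cost coefficient gives that respondent's willingness to pay for attribute \(k\); these individual estimates then serve as the dependent variable in the random-effects regressions.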