11 results for data-based reporting
in Aston University Research Archive
Abstract:
Background: Parkinson's disease (PD) is an incurable neurological disease with approximately 0.3% prevalence. The hallmark symptom is gradual movement deterioration. Current scientific consensus about disease progression holds that symptoms will worsen smoothly over time unless treated. Accurate information about symptom dynamics is of critical importance to patients, caregivers, and the scientific community for the design of new treatments, clinical decision making, and individual disease management. Long-term studies characterize the typical time course of the disease as an early linear progression gradually reaching a plateau in later stages. However, symptom dynamics over durations of days to weeks remain unquantified. Currently, there is a scarcity of objective clinical information about symptom dynamics at intervals shorter than 3 months stretching over several years, but Internet-based patient self-report platforms may change this. Objective: To assess the clinical value of online self-reported PD symptom data recorded by users of the health-focused Internet social research platform PatientsLikeMe (PLM), in which patients quantify their symptoms on a regular basis on a subset of the Unified Parkinson's Disease Rating Scale (UPDRS). By analyzing these data, we aim for a scientific window on the nature of symptom dynamics for assessment intervals shorter than 3 months over durations of several years. Methods: Online self-reported data were validated against the gold-standard Parkinson's Disease Data and Organizing Center (PD-DOC) database, which contains clinical symptom data at intervals greater than 3 months. The data were compared visually using quantile-quantile plots and numerically using the Kolmogorov-Smirnov test. Using a simple piecewise linear trend estimation algorithm, the PLM data were smoothed to separate random fluctuations from continuous symptom dynamics. Subtracting the trends from the original data revealed random fluctuations in symptom severity. The average magnitude of fluctuations versus time since diagnosis was modeled using a gamma generalized linear model. Results: Distributions of age at diagnosis and UPDRS scores in the PLM and PD-DOC databases were broadly consistent. The PLM patients were systematically younger than the PD-DOC patients and showed increased symptom severity in the PD off state. The average fluctuation in symptoms (UPDRS Parts I and II) was 2.6 points at the time of diagnosis, rising to 5.9 points 16 years after diagnosis. These fluctuations exceed the estimated minimal and moderate clinically important differences, respectively. Not all patients conformed to the current clinical picture of gradual, smooth change: many patients had regimes where symptom severity varied in an unpredictable manner, or underwent large, rapid changes in an otherwise more stable progression. Conclusions: This information about short-term PD symptom dynamics contributes new scientific understanding of disease progression that is currently very costly to obtain without self-administered Internet-based reporting. This understanding should have implications for the optimization of clinical trials of new treatments and for the choice of timescales for treatment decisions.
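The pipeline described here can be illustrated compactly. The sketch below is not the authors' code: it uses synthetic data, a naive fixed-window piecewise-linear detrend standing in for the paper's unspecified trend algorithm, and assumed variable names, but it combines the same ingredients named in the abstract (a two-sample Kolmogorov-Smirnov comparison and a gamma generalized linear model of fluctuation magnitude against time since diagnosis).

```python
# Illustrative sketch (not the authors' code): synthetic UPDRS-like series,
# a naive piecewise-linear detrend, and a gamma GLM of fluctuation size
# versus time since diagnosis. All data and names here are hypothetical.
import numpy as np
from scipy import stats
import statsmodels.api as sm

rng = np.random.default_rng(0)

# Synthetic "self-reported UPDRS" series: slow drift plus growing noise.
t_years = np.linspace(0, 16, 200)                  # time since diagnosis
updrs = 20 + 1.2 * t_years + rng.normal(0, 1 + 0.3 * t_years)

# Distribution check against a second, reference-like sample (cf. PD-DOC).
reference = 20 + 1.2 * t_years + rng.normal(0, 1 + 0.3 * t_years)
ks_stat, p_value = stats.ks_2samp(updrs, reference)

# Naive piecewise-linear trend: independent least-squares fits per window.
def piecewise_trend(t, y, n_segments=8):
    trend = np.empty_like(y)
    for seg in np.array_split(np.arange(len(t)), n_segments):
        a, b = np.polyfit(t[seg], y[seg], 1)
        trend[seg] = a * t[seg] + b
    return trend

trend = piecewise_trend(t_years, updrs)
fluct = np.abs(updrs - trend)                      # residual fluctuations

# Gamma GLM with log link: fluctuation magnitude vs time since diagnosis.
X = sm.add_constant(t_years)
fit = sm.GLM(fluct + 1e-9, X,
             family=sm.families.Gamma(sm.families.links.Log())).fit()
print(f"KS p-value: {p_value:.3f}")
print(fit.params)   # intercept and slope on the log scale
```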
Abstract:
This study explores the challenges of implementing International Financial Reporting Standards (IFRS) at the organisational level. Based on interviews with experts whose aggregated experience covers the transition projects of over 170 reporting entities, this paper highlights the main challenges in delivering a successful implementation of IFRS. The findings show that the problems faced in implementation include a lack of education and training, securing executive-level support, identifying and responding to the wider business-related implications of the transition, and issues with capturing the information necessary for reporting under IFRS. This paper complements the existing literature and offers a qualitative alternative for considering the transition to IFRS, giving insight into the organisational context of IFRS implementation. These insights are useful not only from a historic perspective, but also for organisations and regulators in the many jurisdictions where IFRS is permitted but not required, and where more reporting entities will voluntarily move to IFRS-based reporting in the future. More broadly, they are also applicable to the challenges faced in implementing new and significantly revised IFRSs.
Abstract:
This article qualitatively analyzes the Critical Success Factors (CSFs) for Information Systems (IS) executive careers, based on evidence gathered from five case studies carried out in 1997. Typical IS executive career paths are presented in a time-series style, and the CSFs are interpreted within a descriptive framework by synthesising the case data based on Social Cognitive Theory. The descriptive framework suggests that successful IS executive careers are most likely to be achieved by well-educated and experienced IS employees who have the right attitude towards both their career and their work, together with good performance. They would also exhibit an ability for self-learning and for anticipating future uses of IT, as well as proficient IS management knowledge and skills, while working in an appropriate organizational environment. Moreover, the framework systematically indicates the interactions between the coupling factors in typical career development processes. This provides a benchmark against which employees aiming at a senior IS executive career can compare their own achievements and aspirations. It also raises propositions for further research on theory building.
Abstract:
A study has been made of the coalescence of secondary dispersions in beds of woven meshes. The variables investigated were superficial velocity, bed depth, mesh geometry and fibre material; the effects of presoaking the bed in the dispersed phase before operation were also considered. Equipment was designed to generate a 0.1% phase ratio toluene-in-water dispersion whose mean drop size was determined using a Coulter Counter. The coalesced drops were sized by photography, and a novel holographic technique was developed to evaluate the mean diameter of the effluent secondary drops. Previous models describing single-phase flow in porous media are reviewed, and the experimental data obtained in this study were found to be best represented by Keller's equation, which is based on a physical model similar to the internal structure of the meshes. Statistical analysis of two-phase data produced a correlation, for each mesh tested, relating the pressure drop to superficial velocity and bed depth. The flow parameter evaluated from the single-phase model is incorporated into a theoretical comparison of drop capture mechanisms, which indicated that direct and indirect interception are predominant. The resulting equation for drop capture efficiency is used to predict the initial, local drop capture rate in a coalescer. A mathematical description of the saturation profiles was formulated and verified by average saturation data. Based on the Blake-Kozeny equation, an expression is derived analytically to predict the two-phase pressure drop using the parameters which characterise the saturation profiles. By specifying the local saturation at the inlet face for a given velocity, good agreement between experimental pressure drop data and the model predictions was obtained.
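The Blake-Kozeny equation that anchors the two-phase pressure-drop model has a simple single-phase form. The sketch below shows only that starting point, not the thesis's saturation-profile extension; all parameter values are assumed for illustration, not taken from the experiments.

```python
# Illustrative only: single-phase laminar pressure drop from the
# Blake-Kozeny equation. All parameter values below are assumptions.
def blake_kozeny_dp(mu, v, bed_depth, d_fibre, voidage):
    """Pressure drop (Pa) across a packed or fibrous bed.

    mu        dynamic viscosity of the continuous phase, Pa.s
    v         superficial velocity, m/s
    bed_depth bed depth, m
    d_fibre   characteristic fibre/packing diameter, m
    voidage   bed void fraction (0-1)
    """
    return 150.0 * mu * v * bed_depth * (1 - voidage) ** 2 / (
        d_fibre ** 2 * voidage ** 3)

# Hypothetical water-phase example: 0.1 m/s through a 10 mm mesh bed.
dp = blake_kozeny_dp(mu=1.0e-3, v=0.1, bed_depth=0.01,
                     d_fibre=150e-6, voidage=0.8)
print(f"single-phase pressure drop ~ {dp:.0f} Pa")
```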
Abstract:
The proliferation of data throughout the strategic, tactical and operational areas of many organisations has created a need for the decision maker to be presented with structured information that is appropriate for the tasks at hand. Despite this abundance of data, however, managers at all levels of the organisation commonly encounter a condition of 'information overload' that results in a paucity of the correct information. Specifically, this thesis focuses on the tactical domain within the organisation and the information needs of the management who reside at this level. In doing so, it argues that the link between decision making at the tactical level and low-level transaction processing data should be a common object model that uses a framework based upon knowledge leveraged from co-ordination theory. To achieve this, the Co-ordinated Business Object Model (CBOM) was created. The CBOM details a two-tier framework: the first tier models data using four interacting object models, namely processes, activities, resources and actors; the second tier analyses the data captured by the four object models and returns information that can be used to support tactical decision making. In addition, the Co-ordinated Business Object Support System (CBOSS) is a prototype tool developed both to support the CBOM implementation and to demonstrate the functionality of the CBOM as a modelling approach for supporting tactical management decision making. Through a graphical user interface, the system allows the user to create and explore alternative implementations of an identified tactical-level process. To validate the CBOM, three verification tests were completed. The results provide evidence that the CBOM framework helps bridge the gap between low-level transaction data and the information that is used to support tactical-level decision making.
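One way to picture the CBOM's first tier is as four plain data structures with tier-2 queries layered on top. The thesis does not publish code, so the class names, fields and query below are hypothetical readings of the model, not its actual implementation.

```python
# Hypothetical sketch of the CBOM's first-tier objects; all names and
# fields are assumptions, not taken from the thesis.
from dataclasses import dataclass, field

@dataclass
class Actor:
    name: str

@dataclass
class Resource:
    name: str
    capacity: float = 1.0

@dataclass
class Activity:
    name: str
    performed_by: Actor
    uses: list[Resource] = field(default_factory=list)

@dataclass
class Process:
    name: str
    activities: list[Activity] = field(default_factory=list)

    def actors(self) -> set[str]:
        # Tier-2-style query: derive decision-support information
        # from the tier-1 transaction-level data.
        return {a.performed_by.name for a in self.activities}

clerk = Actor("order clerk")
process = Process("order handling", [Activity("approve order", clerk)])
print(process.actors())
```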
Abstract:
Wireless sensor networks have been identified as one of the key technologies for the 21st century. They consist of tiny devices with limited processing and power capabilities, called motes, which can be deployed in large numbers to provide useful sensing capabilities. Although they are flexible and easy to deploy, a number of considerations concerning their fault tolerance, energy conservation and re-programmability need to be addressed before we can draw any substantial conclusions about the effectiveness of this technology. In order to overcome their limitations, we propose a middleware solution. The proposed scheme is composed of two main methods. The first method involves the creation of a flexible communication protocol based on technologies such as mobile code/agents and Linda-like tuple spaces. In this way, every node of the wireless sensor network produces and processes data based on what is best both for itself and for the group to which it belongs. The second method incorporates the above protocol into a middleware that aims to bridge the gap between the application layer and low-level constructs such as the physical layer of the wireless sensor network. A fault-tolerant platform for deploying and monitoring applications in real time offers a number of possibilities to the end user, who in parallel has the freedom to experiment with various parameters so that the deployed applications run in an energy-efficient manner inside the network. The proposed scheme is evaluated through a number of trials that aim to test its merits under real-time conditions and to identify its effectiveness against other similar approaches. Finally, the parameters which determine the characteristics of the proposed scheme are also examined.
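The Linda-like coordination model the protocol builds on reduces to three operations: publish a tuple, read one by template, and remove one by template. The sketch below is a minimal single-process illustration of that model, not the middleware itself; a real mote implementation would be distributed and far more constrained.

```python
# Minimal sketch of a Linda-like tuple space (not the proposed
# middleware). None acts as a wildcard in templates.
class TupleSpace:
    def __init__(self):
        self._tuples = []

    def out(self, tup):                  # publish a tuple
        self._tuples.append(tup)

    def _match(self, tup, template):
        return len(tup) == len(template) and all(
            t is None or t == v for v, t in zip(tup, template))

    def rd(self, template):              # non-destructive read
        return next((t for t in self._tuples
                     if self._match(t, template)), None)

    def in_(self, template):             # destructive read ("in" is reserved)
        t = self.rd(template)
        if t is not None:
            self._tuples.remove(t)
        return t

ts = TupleSpace()
ts.out(("temperature", 7, 21.5))         # (reading type, mote id, value)
print(ts.rd(("temperature", None, None)))
```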
Abstract:
We present a data-based statistical study of the effects of seasonal variations on the growth rates of gastro-intestinal (GI) parasitic infection in livestock. The alluded growth rate is estimated through the variation in the number of eggs per gram (EPG) of faeces in animals. In accordance with earlier studies, our analysis too shows that rainfall is the dominant variable in determining EPG infection rates compared to other macro-parameters like temperature and humidity. Our statistical analysis clearly indicates an oscillatory dependence of EPG levels on rainfall fluctuations. Monsoon recorded the highest infection, with a comparative increase of at least 2.5 times over the next most infected period (summer). A least-squares fit of the EPG-versus-rainfall data indicates an approach towards a super-diffusive (i.e. root-mean-square displacement growing faster than the square root of the elapsed time, as obtained for simple diffusion) infection growth pattern for low-rainfall regimes (technically defined as zeroth-level dependence) that gets remarkably augmented for large-rainfall zones. Our analysis further indicates that for low fluctuations in temperature (true of the bulk data), the EPG level saturates beyond a critical value of the rainfall, a threshold that is expected to indicate the onset of the nonlinear regime. The probability density functions (PDFs) of the EPG data show oscillatory behavior in the large-rainfall regime (greater than 500 mm), the frequency of oscillation, once again, being determined by the ambient wetness (rainfall and humidity). Data recorded over three pilot projects spanning three measures of rainfall and humidity bear testimony to the universality of this statistical argument. © 2013 Chattopadhyay and Bandyopadhyay.
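The kind of least-squares fit described, an EPG-versus-rainfall growth exponent estimated in log-log space, can be sketched in a few lines. The data below are synthetic and the exact fitting procedure of the study may differ; an exponent above 0.5 mirrors the abstract's faster-than-diffusive analogy.

```python
# Illustrative power-law fit of EPG vs rainfall on synthetic data;
# not the study's dataset or exact method.
import numpy as np

rng = np.random.default_rng(1)
rainfall = np.linspace(50, 800, 60)                      # mm
epg = 30 * rainfall ** 0.7 * rng.lognormal(0, 0.1, 60)   # synthetic EPG

slope, intercept = np.polyfit(np.log(rainfall), np.log(epg), 1)
print(f"fitted growth exponent ~ {slope:.2f}")           # ~0.7 here
```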
Abstract:
This paper presents an analysis of whether a consumer's decision to switch from one mobile phone provider to another is driven by individual consumer characteristics or by the actions of other consumers in her social network. Such consumption interdependences are estimated using a unique dataset, which contains transaction data based on anonymized call records from a large European mobile phone carrier, used to approximate a consumer's social network. Results show that network effects have an important impact on consumers' switching decisions: switching decisions are interdependent between consumers who interact with each other, and this interdependence increases with the closeness between two consumers as measured by the calling data. In other words, if a subscriber switches carriers, she also affects the switching probabilities of other individuals in her social circle. The paper argues that such an approach is of high relevance both to switching of providers and to the adoption of new products. © 2013 Copyright Taylor and Francis Group, LLC.
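A stylized version of the mechanism, not the paper's econometric model, is a logistic switching probability that rises with the call-weighted share of a subscriber's contacts who have already switched. All function names and parameter values below are hypothetical.

```python
# Stylized peer-effect sketch (not the paper's estimation strategy).
import math

def switch_probability(base_utility, contacts, beta=2.0):
    """contacts: list of (call_minutes_with_contact, has_switched)."""
    total = sum(m for m, _ in contacts)
    peer_share = sum(m for m, s in contacts if s) / total if total else 0.0
    u = base_utility + beta * peer_share   # closeness-weighted peer effect
    return 1.0 / (1.0 + math.exp(-u))      # logistic link

# One close contact (200 min) switched; two casual ones (20 min) did not.
print(switch_probability(-1.0, [(200, True), (20, False), (20, False)]))
```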
Abstract:
The present paper offers a methodological approach to the estimation and definition of the enthalpies constituting an energy balance around a fast pyrolysis experiment conducted in a laboratory-scale fluid bed with a capacity of 1 kg/h. Pure N2 was used as the fluidization medium at atmospheric pressure, and the operating temperature (∼500°C) was adjusted with electrical resistors. The biomass feedstock used was beech wood. A satisfactory 92.5% retrieval of products (dry-basis mass balance) was achieved, with the differences mainly attributed to loss of some bio-oil constituents into the quenching medium, ISOPAR™. The chemical enthalpy recovery for bio-oil, char and permanent gases was calculated at 64.6%, 14.5% and 7.1%, respectively. All the energy losses from the experimental unit into the environment, namely from the pyrolyser, cooling unit etc., are discussed and compared to the heat of fast pyrolysis, which was calculated at 1123.5 kJ per kg of beech wood. This represents only 2.4% of the total enthalpy of the biomass, or 6.5% on an HHV basis. For the estimation of some important thermo-physical properties such as heat capacity and density, it was found that estimates using data based on the compounds identified by GC/MS analysis are very close to the reference values, despite the small fraction of the bio-oil components detected. The methodology and results can serve as a starting point for the proper design of fast pyrolysis experiments and pilot- and/or industrial-scale plants.
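The enthalpy-recovery bookkeeping reduces to simple arithmetic: each product's yield times its heating value, divided by the feed's heating value. In the worked illustration below, only the reported recovery fractions and the 1123.5 kJ/kg heat of pyrolysis come from the abstract; the yields and heating values are assumed round numbers chosen to be consistent with them, not the study's measurements.

```python
# Worked illustration of chemical enthalpy recovery; yields and HHVs
# below are assumptions, back-calculated to match the reported fractions.
feed_hhv = 17.3e3   # kJ/kg beech wood (assumed; ~6.5% share of 1123.5 kJ/kg)

# Assumed product yields (kg/kg feed, dry basis) and HHVs (kJ/kg).
products = {
    "bio-oil": (0.65, 17.2e3),
    "char":    (0.10, 25.0e3),
    "gas":     (0.20, 6.0e3),
}

for name, (yield_frac, hhv) in products.items():
    recovery = 100 * yield_frac * hhv / feed_hhv
    print(f"{name:8s} chemical enthalpy recovery ~ {recovery:.1f}%")

# Heat of fast pyrolysis relative to feed energy (reported value):
print(f"heat of pyrolysis ~ {100 * 1123.5 / feed_hhv:.1f}% of HHV")
```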
Abstract:
A landfill represents a complex and dynamically evolving structure that can be stochastically perturbed by exogenous factors. Both the thermodynamic (equilibrium) and time-varying (non-steady-state) properties of a landfill are affected by spatially heterogeneous and nonlinear subprocesses that combine with constraining initial and boundary conditions arising from the associated surroundings. While multiple approaches have been taken to model landfill statistics, incorporating spatially dependent parameters on the one hand (the data-based approach) and continuum dynamical mass-balance equations on the other (equation-based modelling), practically no attempt has been made to amalgamate these two approaches while also incorporating the inherent stochastically induced fluctuations affecting the process overall. In this article, we implement a minimalist scheme for modelling the time evolution of a realistic three-dimensional landfill through a reaction-diffusion based approach, focusing on the coupled interactions of four key variables - solid mass density, hydrolysed mass density, acetogenic mass density and methanogenic mass density - that are themselves stochastically affected by fluctuations, coupled with diffusive relaxation of the individual densities, in the ambient surroundings. Our results indicate that, close to the linearly stable limit, the large-time steady-state properties arising out of a series of complex coupled interactions between the stochastically driven variables are scarcely affected by the biochemical growth-decay statistics. Our results clearly show that an equilibrium landfill structure is primarily determined by the solid and hydrolysed mass densities alone, rendering the other variables statistically "irrelevant" in this (large-time) asymptotic limit. The other major implication of incorporating stochasticity into the landfill evolution dynamics lies in the greatly reduced production times of the plants, which are now approximately 20-30 years instead of the 50 years and above predicted by previous deterministic models. The predictions from this stochastic model are in conformity with available experimental observations.
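The modelling idea, four coupled densities with diffusive relaxation, reaction coupling and stochastic forcing, can be sketched in one dimension with an Euler-Maruyama integrator. The sketch below is not the authors' model: it is a minimal stand-in with illustrative linear reaction rates, diffusion constants, noise strength and periodic boundaries, all of which are assumptions.

```python
# Minimal 1-D stochastic reaction-diffusion sketch (not the authors'
# 3-D model); all rate, diffusion and noise constants are illustrative.
import numpy as np

rng = np.random.default_rng(2)
nx, nt, dx, dt = 50, 2000, 1.0, 0.01
D = np.array([0.1, 0.2, 0.3, 0.3])          # diffusion per species
sigma = 0.02                                 # noise strength

# Species order: solid, hydrolysed, acetogenic, methanogenic density.
u = np.zeros((4, nx))
u[0] = 1.0                                   # start with solid waste only

def laplacian(f):
    # Second difference with periodic boundaries, row-wise.
    return (np.roll(f, 1, axis=-1) - 2 * f + np.roll(f, -1, axis=-1)) / dx**2

k_h, k_a, k_m = 0.05, 0.04, 0.03             # hydrolysis / uptake rates
for _ in range(nt):
    s, h, a, m = u
    react = np.array([
        -k_h * s,                            # solid is hydrolysed
        k_h * s - k_a * h,                   # hydrolysed feeds acetogens
        k_a * h - k_m * a,                   # acetogens feed methanogens
        k_m * a,                             # methanogenic mass grows
    ])
    noise = sigma * rng.normal(size=u.shape) * np.sqrt(dt)
    u += dt * (D[:, None] * laplacian(u) + react) + noise
    u = np.clip(u, 0, None)                  # densities stay non-negative

print("mean densities:", u.mean(axis=1).round(3))
```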