15 results for Privacy By Design, Data Protection Officer, Privacy Officer, trattamento, dati personali, PETs
in Aston University Research Archive
Abstract:
DUE TO COPYRIGHT RESTRICTIONS ONLY AVAILABLE FOR CONSULTATION AT ASTON UNIVERSITY LIBRARY AND INFORMATION SERVICES WITH PRIOR ARRANGEMENT
Abstract:
Templated, macroporous Mg-Al hydrotalcites synthesised via alkali-free co-precipitation exhibit superior performance in the transesterification of C4-C18 triglycerides for biodiesel production, with the rate enhancement increasing with alkyl chain length. Promotion reflects improved diffusion of bulky triglycerides and accessibility of active sites within the hierarchical macropore-micropore architecture. © 2012 The Royal Society of Chemistry.
Abstract:
The aim of the research project was to gain a complete and accurate accounting of the needs and deficiencies of materials selection and design data, with particular attention given to the feasibility of a computerised materials selection system that would include application analysis, property data and screening techniques. The project also investigates and integrates the three major aspects of materials resources, materials selection and materials recycling. Consideration of the materials resource base suggests that, though our discovery potential has increased, geologic availability is the ultimate determinant, and several metals may well become scarce at the same time, thus compounding the problem of substitution. With around 2 to 20 million units of engineering materials data, the use of a computer is the only logical answer for the scientific selection of materials. The system developed at Aston is used for data storage, mathematical computation and output, and enables programs to be run in batch and interactive (on-line) mode. With modification, the program can also handle such variables as the quantity of mineral resources, the energy cost of materials, and the depletion and utilisation rates of strategic materials. The work also carries out an in-depth study of copper recycling in the U.K. and concludes that somewhere in the region of 2 million tonnes of copper are missing from the recycling cycle. It also sets out guidelines on product design and conservation policies from the recyclability point of view.
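As a rough illustration of the screening technique such a computerised selection system applies, the sketch below filters candidate materials against property thresholds and ranks the survivors by a simple merit index. All material names, property values and thresholds are hypothetical, not taken from the thesis.

```python
# Minimal sketch of threshold-based materials screening of the kind a
# computerised selection system performs. All property data are hypothetical.
from dataclasses import dataclass

@dataclass
class Material:
    name: str
    yield_strength_mpa: float   # MPa
    density_kg_m3: float        # kg/m^3
    relative_cost: float        # cost index per kg (arbitrary units)

CANDIDATES = [
    Material("mild steel", 250, 7850, 1.0),
    Material("Al 6061-T6", 276, 2700, 2.5),
    Material("Ti-6Al-4V", 880, 4430, 20.0),
]

def screen(materials, min_strength, max_density):
    """Discard candidates failing the thresholds, then rank the survivors
    by a simple merit index (strength per unit cost)."""
    passed = [m for m in materials
              if m.yield_strength_mpa >= min_strength
              and m.density_kg_m3 <= max_density]
    return sorted(passed,
                  key=lambda m: m.yield_strength_mpa / m.relative_cost,
                  reverse=True)

for m in screen(CANDIDATES, min_strength=260, max_density=5000):
    print(m.name)
```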
Abstract:
This paper proposes a new converter protection method, primarily based on a series dynamic resistor (SDR) that avoids the doubly-fed induction generator (DFIG) control being disabled by crowbar protection during fault conditions. A combined converter protection scheme based on the proposed SDR and conventional crowbar is analyzed and discussed. The main protection advantages are due to the series topology when compared with crowbar and dc-chopper protection. Various fault overcurrent conditions (both symmetrical and asymmetrical) are analyzed and used to design the protection in detail, including the switching strategy and coordination with crowbar, and resistance value calculations. PSCAD/EMTDC simulation results show that the proposed method is advantageous for fault overcurrent protection, especially for asymmetrical faults, in which the traditional crowbar protection may malfunction.
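As a rough illustration of the series-insertion idea (not the paper's actual switching strategy, coordination logic or values): a series resistor is switched into circuit when fault overcurrent is detected and bypassed once the current recovers, with a hysteresis band to avoid chattering. All thresholds below are hypothetical.

```python
# Illustrative threshold-based switching for a series dynamic resistor (SDR).
# Current levels are hypothetical, not taken from the paper.
I_TRIGGER = 2.0   # per-unit rotor current that triggers SDR insertion
I_RESET = 1.1     # per-unit current below which the SDR is bypassed again

def sdr_state(i_rotor_pu: float, inserted: bool) -> bool:
    """Return True when the SDR should be in circuit. The hysteresis band
    between I_TRIGGER and I_RESET avoids chattering during fault recovery."""
    if not inserted and i_rotor_pu > I_TRIGGER:
        return True
    if inserted and i_rotor_pu < I_RESET:
        return False
    return inserted

# Example: a fault transient pushes the rotor current above the threshold.
state = False
for i in [0.9, 1.5, 2.4, 3.1, 1.8, 1.0]:
    state = sdr_state(i, state)
    print(f"i = {i:.1f} pu -> SDR {'inserted' if state else 'bypassed'}")
```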
Abstract:
This PhD thesis belongs to three main knowledge domains: operations management, environmental management, and decision making. With the automotive industry as the key sector, the investigation was undertaken with the aim of deepening the understanding of environmental decision-making processes in the operations function. The central research question for this thesis is 'Why and how do manufacturing companies take environmental decisions?' This PhD research project used a case study research strategy supplemented by secondary data analysis and the testing and evaluation of a proposed systems-thinking model for environmental decision making. Interviews and focus groups were the main methods for data collection. The findings of the thesis show that companies that want environmental leadership will need to take environmental decisions beyond manufacturing processes. Because the benefits (including financial gain) of non-manufacturing activities are not yet clear, the decisions related to product design, supply chain and facilities are fully embedded with complexity, subjectivity, and intrinsic risk. Nevertheless, this is the challenge environmental leaders will face: they may enter a paradoxical state in which, although the risk of going greener is high, the risk of not doing so is even higher.
Abstract:
This work describes how the physical properties of a solvent affect the design variables of a physical gas absorption process. The role of each property in determining the capital and running costs of a process has been specified. Direct mathematical relationships have been formulated between every item of capital or running cost and the properties related to that item. The accuracy of the equations formulated has been checked by comparing their outcome with actual design data, and good agreement was found. The equations may be used to evaluate, on an economic basis, any suggested new solvents. A group of solvents was selected for evaluation; their physical properties were estimated or collected as experimental data. The selected group includes three important solvents: the first is polyethylene glycol dimethyl ether (Selexol), which represents the currently most successful solvent; the other two are acetonyl acetone (B2) and n-formyl morpholine, which have been suggested previously as credible alternatives to the current ones. The important characteristics of acetonyl acetone are its high solubility and its low viscosity, while n-formyl morpholine is characterised by its low vapour pressure and its high selectivity. Acetonyl acetone (B2) was found to be the most attractive solvent for commercial applications, particularly for process configurations that include heat exchangers and strippers. The effect of the process configuration on the selected solvent was investigated in detail, and it was found that there is no universal solvent which is best for every process configuration, but rather a best solvent for a given configuration. In previous work, acetonyl acetone was suggested as a commercially promising physical solvent. That suggestion was not fully based on experimental measurement of all the physical properties: the viscosity of acetonyl acetone and its solubility at 1 atm were measured, but the vapour pressure and the solubilities of CO2 and CH4 at high pressure were predicted. In this work, the solubilities of CO2, CH4 and C3H8 in acetonyl acetone were measured over a partial pressure range of 2-22 bar at 25°C. The vapour pressure of this solvent was also measured, and the Antoine equation was fitted to the experimental data. The experimental data were found not to be in agreement with the predicted values, so acetonyl acetone was re-evaluated according to the experimental data. It was found that this solvent can be recommended for further trials in a pilot plant study or for small-scale commercial units.
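The Antoine correlation referred to above takes the standard form log10(P) = A - B / (C + T). Below is a minimal sketch of fitting its coefficients to vapour-pressure measurements; the data points are synthetic (generated from assumed coefficients plus noise), not the thesis measurements.

```python
# Fit Antoine coefficients log10(P) = A - B / (C + T) to vapour-pressure data.
# The data are synthetic, generated from assumed coefficients plus 2% noise.
import numpy as np
from scipy.optimize import curve_fit

def antoine(T, A, B, C):
    """Vapour pressure (mmHg) as a function of temperature (degrees C)."""
    return 10.0 ** (A - B / (C + T))

rng = np.random.default_rng(0)
T = np.array([25.0, 40.0, 60.0, 80.0, 100.0])
P = antoine(T, 7.2, 1800.0, 235.0) * (1 + rng.normal(0, 0.02, T.size))

(A, B, C), _ = curve_fit(antoine, T, P, p0=(7.0, 1500.0, 230.0))
print(f"A = {A:.2f}, B = {B:.0f}, C = {C:.0f}")
print(f"Predicted vapour pressure at 50 degC: {antoine(50.0, A, B, C):.2f} mmHg")
```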
Abstract:
A fluidized bed process development unit of 0.8 m internal diameter was designed on the basis of results obtained from a bench-scale laboratory unit. For the scale-up, empirical models from the literature were used. The process development unit and peripheral equipment were constructed, assembled and commissioned, and instruments were provided for data acquisition. The fluidization characteristics of the reactor were determined and compared with the design data. An experimental programme was then carried out, and mass and energy balances were made for all the runs. The results showed that the most important independent experimental parameter was the air factor, with an optimum at 0.3. The optimum higher heating value of the gas produced was 6.5 MJ/Nm³, while the thermal efficiency was 70%. Reasonably good agreement was found between the experimental results, theoretical results from a thermodynamic model, and data from the literature. It was found that the attainment of steady state was very sensitive to a continuous and constant feedstock flowrate, since the slightest variation in feed flow resulted in fluctuations of the gas quality. On the basis of the results a set of empirical relationships was developed, which constitutes an empirical model for predicting the performance of fluidized bed gasifiers. This empirical model was supplemented by a design procedure by which fluidized bed gasifiers can be designed and constructed. The design procedure was then extended to cover feedstock feeding and gas cleaning in a conceptual design of a fluidized bed gasification facility. The conceptual design was finally used to perform an economic evaluation of the proposed gasification facility. The economics of this plant (a retrofit application) were favourable.
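The thermal (cold-gas) efficiency quoted above is conventionally the chemical energy in the product gas divided by the energy in the feedstock. A minimal worked sketch follows; only the 6.5 MJ/Nm³ gas heating value comes from the abstract, while the gas yield and feed heating value are hypothetical figures chosen to reproduce the quoted 70%.

```python
# Cold-gas efficiency of a gasifier: energy in product gas / energy in feed.
# Only the gas HHV (6.5 MJ/Nm3) is taken from the abstract; the other
# figures are hypothetical.
gas_hhv = 6.5      # MJ per Nm3 of product gas (from the abstract)
gas_yield = 2.0    # Nm3 of gas per kg of feedstock (hypothetical)
feed_hhv = 18.5    # MJ per kg of biomass feedstock (hypothetical)

efficiency = gas_hhv * gas_yield / feed_hhv
print(f"Cold-gas efficiency: {efficiency:.0%}")   # ~70% with these figures
```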
Abstract:
The role of the production system as a key determinant of the competitive performance of business operations has long been the subject of industrial organization research, even predating the explicit conceptualisation of manufacturing strategy in the literature. Particular emergent production issues such as the globalisation of production, global supply chain management, the management of integrated manufacturing and a growing e-business environment are expected to critically influence overall competitive performance and therefore the strategic success of the organization. More than ever, there is a critical need to configure and improve the production system and operations competence in a strategic way so as to contribute to the long-term competitiveness of the organization. In order to operate competitively and profitably, manufacturing companies, no matter how well managed, all need a long-term 'strategic direction' for the development of operations competence in order to consistently produce more market value at less cost towards a leadership position. As to long-term competitiveness, it is more important to establish a dynamic 'strategic perspective' for continuous operational improvements in pursuit of this direction, together with ongoing reviews of the direction in relation to the overall operating context. However, it is also clear that the existing paradigm of manufacturing strategy development is incapable of adequately responding to the increasing complexity and variation of contemporary business operations. This is reflected in the fact that many manufacturing companies are finding that methodologies advocated in the existing paradigm for developing manufacturing strategy have very limited scale and scope for contextual contingency in empirical application. More importantly, a deficiency has also emerged in the multidimensional and integrative profile from a theoretical perspective when operationalising the underlying concept of strategic manufacturing management established in the literature. The point of departure for this study was a recognition of such contextual and unitary limitations in the existing paradigm of manufacturing strategy development when applied to contemporary industrial organizations in general, and Chinese State Owned Enterprises (SOEs) in particular. As China gradually becomes integrated into the world economy, the relevance of Western management theory and its paradigm becomes a practical matter as much as a theoretical issue. Since China markedly differs from Western countries in terms of culture, society, and political and economic systems, it presents promising grounds to test and refine existing management theories and paradigms with greater contextual contingency and a wider theoretical perspective. Under China's ongoing programmes of SOE reform, there has been an increased recognition that strategy development is the very essence of the management task for managers of manufacturing companies, in the same way as it is for their counterparts in Western economies. However, the Western paradigm often displays a rather naive and unitary perspective on the nature of strategic management decision-making, one which largely overlooks context-embedded factors and social/political influences on the development of manufacturing strategy. This thesis studies the successful experiences of developing manufacturing strategy in five high-performing large-scale SOEs within China's petrochemical industry.
China's petrochemical industry constitutes a basic heavy industrial sector, which has always been a strategic focus for reform and development by the Chinese government. Using a confirmation approach, the study focused on exploring and conceptualising the empirical paradigm of manufacturing strategy development practised by management, that is, examining the 'empirical specifics' and surfacing the 'managerial perceptions' of content configuration, context of consideration, and process organization in developing a manufacturing strategy. The research investigation adopts a qualitative exploratory case study methodology with a semi-structured front-end research design. Data collection follows a longitudinal, multiple-case design and triangulates case evidence from sources including qualitative interviews, direct observation, and a search of documentation and archival records. Data analysis follows an investigative progression from a within-case preliminary interpretation of facts to a cross-case search for patterns through theoretical comparison and analytical generalization. The underlying conceptions in both the manufacturing strategy literature and related studies in business strategy were used to develop the theoretical framework and analytical templates applied during data collection and analysis. The thesis makes both empirical and theoretical contributions to our understanding of the contemporary management paradigm of manufacturing strategy development. First, it provides a valuable contextual contingency of the subject using the business setting of China's SOEs in the petrochemical industry. This has been unpacked into empirical configurations developed for its context of consideration, its content and its process respectively. Of special note, a lean paradigm of business operations and production management discovered at the case companies has significant implications as an emerging alternative for high-volume, capital-intensive state manufacturing in China. Second, it provides a multidimensional and integrative theoretical profile of the subject based upon managerial perspectives conceptualised at the case companies when operationalising manufacturing strategy. This has been unpacked into conceptual frameworks developed for its context of consideration, its content constructs, and its process patterns respectively. Notably, a synergies perspective on the operating context, competitive priorities and competence development of business operations and production management has significant implications for implementing a lean manufacturing paradigm. As a whole, the thesis establishes a theoretical platform for future refinement and development of context-specific methodologies for developing manufacturing strategy.
Abstract:
The Securities and Exchange Commission (SEC) in the United States mandated a new digital reporting system for US companies in late 2008. The new generation of information provision has been dubbed by Chairman Cox 'interactive data' (SEC, 2006a). Despite the promise of its name, we find that in the development of the project retail investors are invoked as calculative actors rather than engaged in dialogue. Similarly, the potential for the underlying technology to be applied in ways that encourage new forms of accountability appears to be forfeited in the interests of enrolling company filers. We theorise the activities of the SEC, and in particular its chairman at the time, Christopher Cox, over a three-year period both prior to and following the 'credit crisis'. We argue that individuals and institutions play a central role in advancing the socio-technical project that is constituted by interactive data. We adopt insights from ANT (Callon, 1986; Latour, 1987, 2005b) and governmentality (Miller, 2008; Miller and Rose, 2008) to show how regulators and the proponents of the technology have acted as spokespersons for the interactive data technology and the retail investor. We examine the way in which calculative accountability has been privileged in the SEC's construction of the retail investor as concerned with atomised, quantitative data (Kamuf, 2007; Roberts, 2009; Tsoukas, 1997). We find that the possibilities for the democratising effects of digital information on the Internet have not been realised in the interactive data project, and that it contains risks for the very investors the SEC claims to seek to protect.
Abstract:
Biological experiments often produce enormous amounts of data, which are usually analyzed by data clustering. Cluster analysis refers to statistical methods used to assign data with similar properties into smaller, more meaningful groups. Two commonly used clustering techniques are introduced in the following section: principal component analysis (PCA) and hierarchical clustering. PCA calculates the variance between variables and groups them into a few uncorrelated groups, or principal components (PCs), that are orthogonal to each other. Hierarchical clustering is carried out by separating data into many clusters and merging similar clusters together. Here, we use the example of human leukocyte antigen (HLA) supertype classification to demonstrate the use of the two methods. Two programs, Generating Optimal Linear Partial Least Square Estimations (GOLPE) and Sybyl, are used for PCA and hierarchical clustering, respectively. However, the reader should bear in mind that the methods have been incorporated into other software as well, such as SIMCA, statistiXL, and R.
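As a rough illustration of the two techniques on generic numeric data, the sketch below runs PCA and then agglomerative (hierarchical) clustering with scikit-learn and SciPy rather than GOLPE or Sybyl; random numbers stand in for the HLA descriptors.

```python
# PCA followed by agglomerative (hierarchical) clustering on toy data.
# Random numbers stand in for HLA descriptors; GOLPE and Sybyl are replaced
# by scikit-learn and SciPy for this sketch.
import numpy as np
from sklearn.decomposition import PCA
from scipy.cluster.hierarchy import linkage, fcluster

rng = np.random.default_rng(0)
X = rng.normal(size=(30, 10))          # 30 alleles x 10 descriptors (toy)

# PCA: project correlated descriptors onto a few orthogonal components.
pca = PCA(n_components=3)
scores = pca.fit_transform(X)
print("Variance explained:", pca.explained_variance_ratio_.round(2))

# Hierarchical clustering: build a merge tree, then cut it into 4 clusters.
tree = linkage(scores, method="average", metric="euclidean")
labels = fcluster(tree, t=4, criterion="maxclust")
print("Cluster assignments:", labels)
```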
Abstract:
Recent policy changes in the UK encourage at-risk communities to learn to live with and adapt to flooding. Adaptation of individual properties by embracing resilient and resistant measures is an important aspect endorsed therein. Uptake of such protection measures by property owners, including Small and Medium-sized Enterprises (SMEs), has traditionally been low. A post-flood situation offers an opportunity to reinstate or reconstruct in a way that integrates flood protection measures, reducing damage and enhancing the ability to recover in the event of a future flood. In order to investigate the reinstatement and reconstruction experiences of flood-affected SMEs, those affected by the 2009 Cockermouth flood event were studied. The results of a questionnaire survey revealed that many SMEs had opted for traditional rather than resilient reinstatement. A detailed case study identified the need to get the business back up and running as soon as possible, a lack of guidance and advice from professionals, and financial concerns as some of the barriers faced by SMEs. It is important that SMEs are provided with the necessary guidance during the post-flood reinstatement stage, to make sure that the opportunity to build back better, integrating flood protection measures, is grasped by SME owners. Stakeholders in the construction industry who are actively involved in post-flood reinstatement work have an important role to play in this regard, providing the necessary guidance and expertise to flooded SMEs.
Abstract:
The Securities and Exchange Commission (SEC) in the United States, and in particular its immediately past chairman, Christopher Cox, has been actively promoting an upgrade of the EDGAR system of disseminating filings. The new generation of information provision has been dubbed by Chairman Cox "Interactive Data" (SEC, 2006). In October this year the Office of Interactive Disclosure was created (http://www.sec.gov/news/press/2007/2007-213.htm). The focus of this paper is to examine the way in which the non-professional investor has been constructed by various actors. We examine the manner in which Interactive Data has been sold as the panacea for financial market 'irregularities' by the SEC and others. The academic literature shows almost no evidence of research into non-professional investors in any real sense (Young, 2006). Both this literature and the behaviour of representatives of institutions such as the SEC and FSA appear to find it convenient to construct this class of investor in a particular form and to speak for them. We theorise the activities of the SEC, and its chairman in particular, over a period of about three years, both prior to and following the 'credit crunch'. Our approach is to examine a selection of the policy documents released by the SEC and other interested parties, and the statements made by some of the policy makers and regulators central to the programme to advance the socio-technical project that is constituted by Interactive Data. We adopt insights from ANT, and more particularly the sociology of translation (Callon, 1986; Latour, 1987, 2005; Law, 1996, 2002; Law & Singleton, 2005), to show how individuals and regulators have acted as spokespersons for this malleable class of investor. We theorise the processes of accountability to investors and others, and in so doing reveal the regulatory bodies taking the regulated for granted. The possible implications of technological developments in digital reporting have also been identified by the CEOs of the six biggest audit firms in a discussion document on the role of accounting information and audit in the future of global capital markets (DiPiazza et al., 2006). The potential for digital reporting enabled through XBRL to "revolutionize the entire company reporting model" (p. 16) is discussed, and they conclude that the new model "should be driven by the wants of investors and other users of company information,..." (p. 17; emphasis in the original). Here, rather than examine the somewhat elusive and vexing question of whether adding interactive functionality to 'traditional' reports can achieve the benefits claimed for non-professional investors, we wish to consider the rhetorical and discursive moves in which the SEC and others have engaged to present such developments as providing clearer reporting and accountability standards and serving the interests of this constructed and largely unknown group: the non-professional investor.
The Long-Term Impact of Business Support? Exploring the Role of Evaluation Timing Using Micro Data
Abstract:
The original contribution of this work is threefold. Firstly, this thesis develops a critical perspective on current evaluation practice of business support, with a focus on the timing of evaluation. The general time frame applied for business support policy evaluation is limited to one to two, seldom three, years post intervention. This is despite calls for long-term impact studies by various authors concerned about time lags before effects are fully realised. This desire for long-term evaluation conflicts with the requirements of policy-makers and funders, who seek quick results. Furthermore, current 'best practice' frameworks do not refer to timing or its implications, and data availability affects the ability to undertake long-term evaluation. Secondly, this thesis provides methodological value for follow-up and similar studies by linking scheme-beneficiary data with official performance datasets; data availability problems are thus avoided through the use of secondary data. Thirdly, this thesis builds the evidence through a longitudinal impact study of small business support in England covering seven years of post-intervention data. This illustrates the variability of results for different evaluation periods, and the value of using multiple years of data for a robust understanding of support impact. For survival, the impact of assistance is found to be immediate, but limited. Concerning growth, significant impact centres on a two- to three-year period post intervention for the linear selection and quantile regression models: positive for employment and turnover, negative for productivity. Attribution of impact may present a problem for subsequent periods. The results clearly support the argument for the use of longitudinal data and analysis, and a greater appreciation by evaluators of the timing factor. This analysis recommends a time frame of four to five years post intervention for soft business support evaluation.
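The growth models referred to above include quantile regressions. Below is a rough sketch of estimating a treatment effect at several quantiles of firm growth with statsmodels; the column names, the CSV file, and the 0/1 `supported` dummy are hypothetical stand-ins for the linked scheme-beneficiary data.

```python
# Quantile regression of firm growth on a support-treatment dummy, estimated
# at several quantiles. Column names and the CSV file are hypothetical; the
# 'supported' column is assumed to be a 0/1 indicator of receiving support.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("linked_firm_panel.csv")  # scheme data linked to official records

for q in (0.25, 0.50, 0.75):
    model = smf.quantreg("employment_growth ~ supported + firm_age + sector", df)
    fit = model.fit(q=q)
    print(f"q = {q}: treatment effect = {fit.params['supported']:.3f}")
```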
Abstract:
Atomisation of an aqueous solution for tablet film coating is a complex process with multiple factors determining droplet formation and properties. The importance of droplet size for an efficient process and a high-quality final product has been noted in the literature, with smaller droplets reported to produce smoother, more homogeneous coatings whilst simultaneously avoiding the risk of damage through over-wetting of the tablet core. In this work the effect of droplet size on tablet film-coat characteristics was investigated using X-ray microcomputed tomography (XμCT) and confocal laser scanning microscopy (CLSM). A quality-by-design approach utilising design of experiments (DOE) was used to optimise the conditions necessary for production of droplets at a small (20 μm) and a large (70 μm) droplet size. Droplet size distribution was measured using real-time laser diffraction, and the volume median diameter was taken as the response. DOE yielded information on the relationship that three critical process parameters (pump rate, atomisation pressure and coating-polymer concentration) had with droplet size. The model generated was robust, scoring highly for model fit (R² = 0.977), predictability (Q² = 0.837), validity and reproducibility. Modelling confirmed that all parameters had either a linear or a quadratic effect on droplet size and revealed an interaction between pump rate and atomisation pressure. Fluidised bed coating of tablet cores was performed with either small or large droplets, followed by CLSM and XμCT imaging. Addition of commonly used contrast materials to the coating solution improved visualisation of the coating by XμCT, showing the coat as a discrete section of the overall tablet. Imaging provided qualitative and quantitative evidence that smaller droplets formed thinner, more uniform and less porous film coats.
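As a rough sketch of the kind of quadratic response-surface model a DOE package fits here, the example below regresses droplet size on linear, interaction and quadratic terms of the three factors. The data are synthetic; only the factor names come from the study.

```python
# Quadratic response-surface fit of droplet size against the three critical
# process parameters named in the abstract. Data points are synthetic.
import numpy as np
from sklearn.preprocessing import PolynomialFeatures
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(1)
# Columns: pump rate (g/min), atomisation pressure (bar), polymer conc. (% w/w)
X = rng.uniform([5, 0.5, 5], [50, 3.0, 20], size=(27, 3))
# Synthetic response: droplet size rises with pump rate, falls with pressure.
y = 30 + 0.8 * X[:, 0] - 12 * X[:, 1] + 0.5 * X[:, 2] + rng.normal(0, 2, 27)

# Expand to linear, interaction and quadratic terms, then fit least squares.
design = PolynomialFeatures(degree=2, include_bias=False)
model = LinearRegression().fit(design.fit_transform(X), y)
print("R^2 on the training design:", model.score(design.fit_transform(X), y))
```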