20 results for Business enterprises -- Electronic data processing -- Study and teaching (Higher) -- Chile
in CentAUR: Central Archive University of Reading - UK
Abstract:
This article analyses the results of an empirical study of the 200 most popular UK-based websites in various sectors of e-commerce services. The study provides empirical evidence on unlawful processing of personal data. It comprises a survey of the methods used to seek and obtain consent to process personal data for direct marketing and advertising, and a test of the frequency of unsolicited commercial email (UCE) received by customers as a consequence of their registration and submission of personal information to a website. Part One of the article presents a conceptual and normative account of data protection, with a discussion of the ethical values on which EU data protection law is grounded and an outline of the elements that must be in place to seek and obtain valid consent to process personal data. Part Two discusses the outcomes of the empirical study, which reveals a significant gap between EU legal theory and practice in data protection. Although a wide majority of the websites in the sample (69%) have a system in place to ask for separate consent to engage in marketing activities, only 16.2% of them obtain consent that is valid under the standards set by EU law. The UCE test shows that only one out of three websites (30.5%) respects the will of the data subject not to receive commercial communications. It also shows that, when submitting personal data in online transactions, there is a high probability (50%) of encountering a website that will ignore the refusal of consent and send UCE. The article concludes that there is a severe lack of compliance by UK online service providers with essential requirements of data protection law. In this respect, it suggests that the standards of implementation, information and supervision by the UK authorities are inadequate, especially in light of the clarifications provided at EU level.
Abstract:
Background: Expression microarrays are increasingly used to obtain large-scale transcriptomic information on a wide range of biological samples. Nevertheless, there is still much debate on the best ways to process data, to design experiments and to analyse the output. Furthermore, many of the more sophisticated mathematical approaches to data analysis in the literature remain inaccessible to much of the biological research community. In this study we examine ways of extracting and analysing a large data set obtained using the Agilent long oligonucleotide transcriptomics platform, applied to a set of human macrophage and dendritic cell samples. Results: We describe and validate a series of data extraction, transformation and normalisation steps which are implemented via a new R function. Analysis of replicate normalised reference data demonstrates that intra-array variability is small (only around 2% of the mean log signal), while inter-array variability from replicate array measurements has a standard deviation (SD) of around 0.5 log(2) units (6% of the mean). The common practice of working with ratios of Cy5/Cy3 signal offers little further improvement in terms of reducing error. Comparison to expression data obtained using Arabidopsis samples demonstrates that the large number of genes in each sample showing a low level of transcription reflects the real complexity of the cellular transcriptome. Multidimensional scaling is used to show that the processed data identify an underlying structure which reflects some of the key biological variables that define the data set. This structure is robust, allowing reliable comparison of samples collected over a number of years by a variety of operators. Conclusions: This study outlines a robust and easily implemented pipeline for extracting, transforming, normalising and visualising transcriptomic array data from the Agilent expression platform.
The analysis is used to obtain quantitative estimates of the SD arising from experimental (non-biological) intra- and inter-array variability, and of a lower threshold for determining whether an individual gene is expressed. The study provides a reliable basis for further, more extensive studies of the systems biology of eukaryotic cells.
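The extraction and normalisation steps summarised above can be sketched in a few lines. The function names, the pseudocount and the simulated intensities below are illustrative assumptions (the paper's own pipeline is an R function); the synthetic replicates are simply given about 0.5 log2 units of inter-array noise to mirror the reported SD:

```python
import numpy as np

def log2_normalise(raw, pseudocount=1.0):
    """Log2-transform raw intensities and median-centre each array."""
    logged = np.log2(raw + pseudocount)
    # Median-centre each array (column) so arrays are directly comparable.
    return logged - np.median(logged, axis=0, keepdims=True)

def interarray_sd(normalised):
    """Median across genes of the per-gene SD over replicate arrays."""
    return float(np.median(np.std(normalised, axis=1, ddof=1)))

rng = np.random.default_rng(0)
true_signal = rng.uniform(4, 14, size=1000)            # per-gene log2 level
# Three replicate arrays: shared signal plus ~0.5 log2 units of array noise.
replicates = 2 ** (true_signal[:, None] + rng.normal(0, 0.5, size=(1000, 3)))
norm = log2_normalise(replicates)
sd = interarray_sd(norm)
```

With noise of 0.5 log2 units injected, the recovered median per-gene SD lands near that value, which is the kind of replicate-based estimate the abstract describes.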
Abstract:
This paper reviews the literature on the practice of using Online Analytical Processing (OLAP) systems to recall information stored by Online Transactional Processing (OLTP) systems. The review provides a basis for discussing the need for information recalled through OLAP systems to maintain the contexts of the transactions whose data were captured by the respective OLTP system. The paper observes an industry trend in which OLTP systems process information into data that are then stored in databases without the business rules that were used to capture them. This necessitates a practice whereby sets of business rules are used to extract, cleanse, transform and load data from disparate OLTP systems into OLAP databases to support complex reporting and analytics. These sets of business rules are usually not the same as the business rules used to capture the data in the originating OLTP systems. The paper argues that differences between the business rules used to interpret the same data sets risk gaps in semantics between the information captured by OLTP systems and the information recalled through OLAP systems. Literature on modelling business transaction information as facts with context was reviewed to identify design trends that contribute to the design quality of OLTP and OLAP systems. The paper then argues that the quality of OLTP and OLAP systems design depends critically on the capture of facts with associated context, the encoding of facts with context into data with business rules, the storage and sourcing of data with business rules, the decoding of data with business rules back into facts with context, and the recall of facts with associated context.
The paper proposes UBIRQ, a design model to aid the co-design of data and business-rule storage for OLTP and OLAP purposes. The proposed model opens the way to multi-purpose databases and business-rule stores shared by OLTP and OLAP systems. Such implementations would enable OLTP systems to record and store data together with the executions of the business rules that captured them, allowing both OLTP and OLAP systems to query data alongside the business rules used to capture those data, and thereby ensuring that information recalled via OLAP systems preserves the contexts of the transactions as captured by the respective OLTP system.
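As a rough illustration of the core idea, storing data together with an identifier for the business rules used to capture them, the sketch below pairs each fact-with-context record with a rule-set reference that both the OLTP and OLAP sides resolve against the same rule store. All names here (`Fact`, `RuleSet`, the VAT rule) are hypothetical and are not part of the published UBIRQ model:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class RuleSet:
    """A named, versioned set of business rules (illustrative)."""
    rule_set_id: str
    description: str

@dataclass(frozen=True)
class Fact:
    """A captured fact with its transaction context and capture rules."""
    value: float
    context: dict          # e.g. customer, currency, timestamp
    rule_set_id: str       # rules in force at OLTP capture time

# Shared store consulted by both OLTP capture and OLAP recall.
RULE_STORE = {
    "uk-vat-2024": RuleSet("uk-vat-2024", "Gross = net * 1.20"),
}

def recall(fact: Fact) -> tuple[float, dict, RuleSet]:
    """OLAP-side recall: return the fact together with the same rules
    that captured it, so its context is not reinterpreted."""
    return fact.value, fact.context, RULE_STORE[fact.rule_set_id]

f = Fact(120.0, {"customer": "C42", "currency": "GBP"}, "uk-vat-2024")
value, context, rules = recall(f)
```

The design point is that the recall path looks up the capture-time rule set rather than applying a separate, possibly divergent, set of ETL rules.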
Abstract:
One of the major differences undergraduates experience during the transition to university is the style of teaching. In schools and colleges most students study key stage 5 subjects in relatively small informal groups where teacher–pupil interaction is encouraged and two-way feedback occurs through question-and-answer delivery. On starting in HE, students are amazed by the sizes of the classes. Even for a relatively small chemistry department with an intake of 60-70 students, biologists, pharmacists and other first-year undergraduates requiring chemistry can boost numbers in the lecture hall to around 200 or higher. In many universities class sizes of 400 are not unusual for first-year groups where efficiency is crucial. Clearly the personalised classroom-style delivery is not practical, and it is a brave student who shows their ignorance by venturing to ask a question in front of such an audience. In these environments learning can be a very passive process: the lecture acts as a vehicle for the conveyance of information, and our students are expected to reinforce their understanding by ‘self-study’, a term whose meaning many struggle to grasp. The use of electronic voting systems (EVS) in such situations can vastly change the students’ learning experience from a passive to a highly interactive process. This principle has already been demonstrated in physics, most notably in the work of Bates and colleagues at Edinburgh.1 These small hand-held devices, similar to those which have become familiar through programmes such as ‘Who Wants to be a Millionaire’, can be used to provide instant feedback to students and teachers alike. Advances in technology now allow them to be used in a range of more sophisticated settings, and comprehensive guides on use have been developed for even the most technophobic staff.
Abstract:
Chongqing is the largest central-government-controlled municipality in China and is now undergoing rapid urbanization. The question remains open: what are the consequences of such rapid urbanization in Chongqing for urban microclimates? An integrated study comprising three different research approaches is adopted in the present paper. Analysis of the observed annual climate data shows an average rising trend of 0.10 °C/decade in the annual mean temperature from 1951 to 2010 in Chongqing, indicating a high degree of urban warming. In addition, two complementary types of field measurement were conducted: fixed weather stations and mobile transverse measurements. Numerical simulations using an in-house program are able to predict the urban air temperature in Chongqing. The urban heat island intensity in Chongqing is stronger in summer than in autumn and winter. The maximum urban heat island intensity occurs at around midnight and can be as high as 2.5 °C. In the daytime, an urban cool island exists. Local greenery has a great impact on the local thermal environment: urban green spaces can reduce urban air temperature and therefore mitigate the urban heat island. The cooling effect of the urban river is limited in Chongqing, as both sides of the river are among the most developed areas, but the relative humidity is much higher near the river than at places far from it.
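Urban heat island intensity (UHII), as reported above, is conventionally derived as the urban-minus-rural air temperature difference for paired observations. The station readings below are illustrative values, not measured Chongqing data:

```python
# Paired night-time readings from a hypothetical urban station and its
# rural reference station, in degrees Celsius.
urban_temp_c = [27.1, 26.4, 25.9, 28.3]
rural_temp_c = [25.0, 24.3, 24.1, 25.8]

# UHII per observation: urban minus rural temperature.
uhii = [u - r for u, r in zip(urban_temp_c, rural_temp_c)]

# Peak intensity, cf. the ~2.5 degC midnight maximum reported for Chongqing.
max_uhii = max(uhii)
```

A negative difference in the same calculation would indicate the daytime urban cool island the abstract mentions.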
Abstract:
With the increase in e-commerce and the digitisation of design data and information, the construction sector has become reliant upon IT infrastructure and systems. The design and production process is more complex and more interconnected, and relies upon greater information mobility, with seamless exchange of data and information in real time. Construction small and medium-sized enterprises (CSMEs), in particular the speciality contractors, can effectively utilise cost-effective collaboration-enabling technologies, such as cloud computing, to help transfer information and data and so improve productivity. The system dynamics (SD) approach offers a perspective and tools that enable a better understanding of the dynamics of complex systems. This research focuses upon SD methodology as a modelling and analysis tool for understanding and identifying the key drivers in the absorption of cloud computing by CSMEs. The aim of this paper is to determine how the use of SD can improve the management of information flow through collaborative technologies, leading to improved productivity. The data supporting the use of SD were obtained through a pilot study consisting of questionnaires and interviews with five CSMEs in the UK house-building sector.
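A system dynamics model of the kind used here reduces, at its simplest, to stocks integrated over flows. The sketch below is a minimal one-stock example (firms that have absorbed cloud collaboration tools, filled by a word-of-mouth adoption flow) integrated with Euler steps; all parameter values are illustrative assumptions, not figures from the paper's pilot study:

```python
def simulate_adoption(total_firms=100, adopters0=5.0,
                      contact_rate=0.03, steps=120, dt=1.0):
    """Euler-integrate a single stock (adopting firms) fed by one flow."""
    adopters = adopters0
    history = [adopters]
    for _ in range(steps):
        # Flow: adoption driven by contact between adopters and the
        # remaining non-adopters (a standard logistic diffusion flow).
        flow = contact_rate * adopters * (total_firms - adopters) / total_firms
        adopters += flow * dt          # the stock accumulates the flow
        history.append(adopters)
    return history

trajectory = simulate_adoption()
```

The resulting S-shaped trajectory is the classic diffusion behaviour an SD model would expose when exploring drivers of cloud absorption.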
Abstract:
The difference between cirrus emissivities at 8 and 11 μm is sensitive to the mean effective ice crystal size of the cirrus cloud, De. By using single-scattering properties of ice crystals shaped as planar polycrystals, diameters of up to about 70 μm can be retrieved, instead of up to 45 μm assuming spheres or hexagonal columns. The method described in this article is used for a global determination of mean effective ice crystal sizes of cirrus clouds from TOVS satellite observations. A sensitivity study of the De retrieval to uncertainties in hypotheses on ice crystal shape, size distributions and temperature profiles, as well as in vertical and horizontal cloud heterogeneities, shows that uncertainties can be as large as 30%. However, the TOVS data set is one of the few data sets that provide global and long-term coverage. Analysis of the years 1987–1991 shows that the retrieved effective ice crystal diameters De are stable from year to year. For 1990 a global median De of 53.5 μm was determined. Averages distinguishing ocean/land, season and latitude lie between 23 μm in winter over Northern Hemisphere midlatitude land and 64 μm in the tropics. In general, larger values of De are found in regions with higher atmospheric water vapor and for cirrus with a smaller effective emissivity.
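The retrieval idea can be sketched as an inversion: in the radiative model the 8-minus-11 μm emissivity difference varies monotonically with De, so an observed difference can be mapped back to a diameter by interpolating a modelled lookup curve. The grid values below are illustrative placeholders, not the paper's polycrystal single-scattering results:

```python
# Hypothetical modelled relation: emissivity difference falls as De grows.
de_grid_um = [10, 20, 30, 40, 50, 60, 70]                  # effective diameter
delta_eps  = [0.14, 0.10, 0.07, 0.05, 0.035, 0.025, 0.02]  # modelled difference

def retrieve_de(observed_delta):
    """Piecewise-linear inversion of the modelled delta-epsilon(De) curve."""
    pairs = sorted(zip(delta_eps, de_grid_um))   # ascending in delta-epsilon
    if not pairs[0][0] <= observed_delta <= pairs[-1][0]:
        raise ValueError("observed difference outside modelled range")
    for (d0, de0), (d1, de1) in zip(pairs, pairs[1:]):
        if d0 <= observed_delta <= d1:
            w = (observed_delta - d0) / (d1 - d0)
            return de0 + w * (de1 - de0)

de = retrieve_de(0.06)
```

A real retrieval would build the lookup table from single-scattering calculations for the assumed crystal habit; the monotone-inversion step is the same.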
A study of students' metacognitive beliefs about foreign language study and their impact on learning
Abstract:
This article reports on an investigation into the language learning beliefs of students of French in England, aged 16 to 18. It focuses on qualitative data from two groups of learners (10 in total). While both groups had broadly similar levels of achievement in French in terms of examination success, they differed greatly in the self-image they had of themselves as language learners, with one group displaying low levels of self-efficacy beliefs regarding the possibility of future success. The implications of such beliefs for students' levels of motivation and persistence are discussed, together with their possible causes. The article concludes by suggesting changes in classroom practice that might help students develop a more positive image of themselves as language learners.
Abstract:
Purpose – The purpose of this paper is to demonstrate the key strategic decisions involved in turning around a large multinational operating in a dynamic market. Design/methodology/approach – The paper is based on analysis of archival documents and a semi-structured interview with the chairman of the company credited with its rescue. Findings – Turnaround is complex and involves both planned and emergent strategies. Progress is non-linear, requiring adjustment and changes in the direction of travel. Top management credibility and vision are critical to success. Rescue is only possible if the company has a strongly cash-generative business among its businesses. The speed of decision making, decisiveness and the ability to implement strategy are among the key ingredients of success. Originality/value – Turnaround is an under-researched area in strategy. This paper contributes to a better understanding of this important area and bridges the gap between theory and practice. It provides a practical view and demonstrates how a leading executive with significant expertise and a successful turnaround track record deals with the inherent dilemmas of turnaround.
Abstract:
Purpose – Mergers and acquisitions are among the most intensely used strategic decisions. Yet research by both academics and consulting groups suggests that many mergers and acquisitions fail to add value. On the other hand, many companies successfully use mergers and acquisitions to grow and add shareholder value. One such company is WPP. The aim of this paper is to explore why WPP has been successful in its acquisition strategy while so many other companies fail. Design/methodology/approach – The paper draws on documentary evidence and a semi-structured interview with Sir Martin Sorrell – Chief Executive and founder of WPP. Research limitations/implications – The case study offers a unique insight into the thinking of a successful acquirer and sheds light on how mergers and acquisitions are managed by WPP. However, because of its design, the findings are not generalisable. Originality/value – This case study sheds light on how mergers and acquisitions can be used to create a £9 billion company from a standing start. Furthermore, very few case studies offer insight into the thinking of an entrepreneurial Chief Executive who established the business, grew it to become the largest and most profitable marketing services company in the world and engineered close to 300 acquisitions.
Abstract:
Purpose – The focus of the extant strategy literature is on for-profit organisations and, within this group, on public organisations. There are other forms of organisation, and following the deep recession of 2008 there is greater interest in them. In this case study and interview the aim is to examine the strategy, strategic decisions and strategic management of a not-for-profit provident. Design/methodology/approach – The paper draws on documentary evidence and a semi-structured interview with Ray King, chief executive of Bupa. The perspective of the CEO is key in strategy, and such perspectives are relatively rare. Findings – Bupa invests its surplus to provide better healthcare. Free from the pressures of quarterly reporting and shareholders, it can pursue long-term value creation for members rather than short-term surpluses. Research limitations/implications – The case study and interview offer a unique insight into strategy-making within a successful mutual provident that has grown organically and externally, becoming an international leader in health insurance. Originality/value – This case study sheds light on strategy-making within a not-for-profit provident that has diversified and grown significantly over the past six decades. Furthermore, very few case studies offer insight into the thinking of a chief executive who has successfully managed a business in a turbulent environment.