922 results for Capital market -- United States -- Data processing
Abstract:
This article analyses the results of an empirical study of the 200 most popular UK-based websites across various sectors of e-commerce services. The study provides empirical evidence of unlawful processing of personal data. It comprises a survey of the methods used to seek and obtain consent to process personal data for direct marketing and advertising, and a test of the frequency of unsolicited commercial email (UCE) received by customers as a consequence of registering and submitting personal information to a website. Part One of the article presents a conceptual and normative account of data protection, discussing the ethical values on which EU data protection law is grounded and outlining the elements that must be in place to seek and obtain valid consent to process personal data. Part Two discusses the outcomes of the empirical study, which reveals a significant gap between EU data protection law in theory and in practice. Although a wide majority of the websites in the sample (69%) have a system in place to ask for separate consent to marketing activities, only 16.2% of them obtain consent that is valid under the standards set by EU law. The UCE test shows that only one website in three (30.5%) respects the data subject's wish not to receive commercial communications. It also shows that, when submitting personal data in online transactions, there is a high probability (50%) of encountering a website that will ignore the refusal of consent and send UCE. The article concludes that UK online service providers show a severe lack of compliance with essential requirements of data protection law. In this respect, it suggests that the standards of implementation, information and supervision applied by the UK authorities are inadequate, especially in light of the clarifications provided at EU level.
Abstract:
Background: Expression microarrays are increasingly used to obtain large-scale transcriptomic information on a wide range of biological samples. Nevertheless, there is still much debate on the best ways to process data, design experiments and analyse the output. Furthermore, many of the more sophisticated mathematical approaches to data analysis in the literature remain inaccessible to much of the biological research community. In this study we examine ways of extracting and analysing a large data set obtained using the Agilent long-oligonucleotide transcriptomics platform, applied to a set of human macrophage and dendritic cell samples. Results: We describe and validate a series of data extraction, transformation and normalisation steps which are implemented via a new R function. Analysis of replicate normalised reference data demonstrates that intra-array variability is small (only around 2% of the mean log signal), while inter-array variability from replicate array measurements has a standard deviation (SD) of around 0.5 log2 units (6% of mean). The common practice of working with ratios of Cy5/Cy3 signal offers little further improvement in terms of reducing error. Comparison with expression data obtained using Arabidopsis samples demonstrates that the large number of genes in each sample showing a low level of transcription reflects the real complexity of the cellular transcriptome. Multidimensional scaling is used to show that the processed data identify an underlying structure which reflects some of the key biological variables that define the data set. This structure is robust, allowing reliable comparison of samples collected over a number of years and by a variety of operators. Conclusions: This study outlines a robust and easily implemented pipeline for extracting, transforming, normalising and visualising transcriptomic array data from the Agilent expression platform. The analysis is used to obtain quantitative estimates of the SD arising from experimental (non-biological) intra- and inter-array variability, and a lower threshold for determining whether an individual gene is expressed. The study provides a reliable basis for further, more extensive studies of the systems biology of eukaryotic cells.
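For readers who want a concrete picture of the kind of pipeline this abstract describes, here is a minimal Python sketch: log-transform, per-array median centring, an inter-array SD estimate from replicates, and multidimensional scaling. The paper's own implementation is an R function and is not reproduced here; all data, the centring choice and the printed figures below are placeholders, not the study's results.

```python
import numpy as np
from sklearn.manifold import MDS

rng = np.random.default_rng(0)
# Simulated raw intensities: 1000 genes x 6 replicate arrays (placeholder data).
raw = rng.lognormal(mean=6.0, sigma=1.0, size=(1000, 6))

# 1. Log2-transform the signal, as is standard for array intensities.
log_signal = np.log2(raw)

# 2. Median-centre each array so arrays are comparable (a simple stand-in
#    for the normalisation step the abstract refers to).
normalised = log_signal - np.median(log_signal, axis=0)

# 3. Estimate inter-array variability from replicates: the SD of each gene's
#    normalised signal across arrays (the abstract reports ~0.5 log2 units).
interarray_sd = normalised.std(axis=1, ddof=1)
print(f"median inter-array SD: {np.median(interarray_sd):.2f} log2 units")

# 4. Multidimensional scaling on pairwise distances between arrays, to look
#    for structure reflecting the biological variables that define the set.
dist = np.linalg.norm(normalised.T[:, None, :] - normalised.T[None, :, :], axis=2)
coords = MDS(n_components=2, dissimilarity="precomputed",
             random_state=0).fit_transform(dist)
print(coords)  # 2-D embedding; replicate arrays should land close together
```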
Abstract:
The authors examined avoidance personal goals as concurrent (Study 1) and longitudinal (Study 2) predictors of multiple aspects of well-being in the United States and Japan. In both studies, participants adopted more avoidance personal goals in Japan than in the United States. Both studies also demonstrated that avoidance personal goals were significant negative predictors of the most relevant aspects of well-being in each culture. Specifically, avoidance personal goals were negative predictors of intrapersonal and eudaimonic well-being in the United States, and negative predictors of interpersonal and eudaimonic well-being in Japan. The results clarify and extend puzzling findings from prior empirical work in this area and raise provocative possibilities about the nature of avoidance goal pursuit.
Abstract:
Chester Crocker was appointed as Reagan's Assistant Secretary of State for African Affairs in 1981. He had criticised the inconsistencies of US African policy and proposed a renewed emphasis on the balance between America's global interests and specific regional priorities. While the focus of Congressmen, journalists and public opinion centred on the issue of apartheid, it was the Namibian War of Independence (South African Border War) that initially drew the attention of the Reagan administration, and it was the resolution of this war that remained the priority for the US government in this region throughout Crocker's time in office.
Abstract:
The purpose of this work is to verify the stability of the relationship between real activity and the interest rate spread. The test is based on Chen (1988) and Osorio and Galea (2006). The analysis is applied to Chile and the United States over the period 1980 to 1999. In general, in both cases the relationship was statistically significant in the early 1980s, but a break point is found in both countries during that decade, suggesting that the relationship depends on the monetary rule followed by the central bank.
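As an illustration of what a stability test of this kind involves, the following Python sketch runs a simple Chow-type break test on simulated data: it asks whether letting the regression coefficients differ before and after a candidate break date significantly reduces the residual sum of squares. This is a generic textbook test, not the procedure of Chen (1988) or Osorio and Galea (2006); the series, sample size and break date are all hypothetical.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
n, k = 80, 2                       # quarterly observations; regressors (const + spread)
spread = rng.normal(size=n)
# Simulate a relationship that weakens halfway through the sample.
beta = np.where(np.arange(n) < n // 2, 1.0, 0.2)
growth = 0.5 + beta * spread + rng.normal(scale=0.5, size=n)

def rss(y, X):
    """Residual sum of squares from an OLS fit."""
    coef, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ coef
    return resid @ resid

X = np.column_stack([np.ones(n), spread])
split = n // 2                     # candidate break date (hypothetical)
rss_pooled = rss(growth, X)
rss_split = rss(growth[:split], X[:split]) + rss(growth[split:], X[split:])

# Chow F statistic: does allowing separate pre/post-break coefficients
# significantly improve the fit?
f_stat = ((rss_pooled - rss_split) / k) / (rss_split / (n - 2 * k))
p_value = stats.f.sf(f_stat, k, n - 2 * k)
print(f"Chow F = {f_stat:.2f}, p = {p_value:.3f}")
```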
Abstract:
A student from the Data Processing program at the New York Trade School is shown working. Black and white photograph with some edge damage due to writing in black along the top.
Abstract:
Felice Gigante, a graduate of the New York Trade School Electronics program, works on a machine in his job as a Data Processing Customer Engineer for the International Business Machines Corp. Original caption reads, "Felice Gigante - Electronices, International Business Machines Corp." Black and white photograph with caption glued to reverse.
Abstract:
Written around the time of the Golden Venture incident, Chang-rae Lee's Native Speaker makes particular reference to that incident, thereby implying that certain immigrants, on the grounds of their racial identities, are mistreated and regarded as aliens by some Americans. While some whites discriminate against immigrants, there is also widespread ethnic tension between Korean Americans and African Americans. Significantly, the racial conflict between Koreans and blacks and the racist attitude of some whites toward immigrants are mirrored in the relationship between the Korean-American protagonist Henry and his American wife Lelia. That is, because of their different racial identities they do not understand each other and constantly argue. Toward the end of the novel, however, Henry and Lelia come to understand each other. While the ethnic conflict between Koreans and blacks and certain whites' discriminatory attitudes toward immigrants are serious, the novel suggests the unimportance of racial identity. In other words, the novel concludes that there should be no discriminatory treatment of immigrants and that, in fact, everyone is a native speaker in America. The novel offers no message about how racial conflict could be resolved. However, this essay suggests that by investigating how the tension between Henry and Lelia is resolved, one could propose a solution to the ethnicity problem in America and in real life.
Abstract:
GPS technology is now embedded in portable, low-cost electronic devices to track the movements of mobile objects. This development has greatly affected the transportation field by creating a novel and rich source of traffic data on the road network. Although GPS devices hold significant promise for overcoming problems such as underreporting, respondent fatigue, inaccuracies and other human errors in data collection, the technology is still relatively new and raises many issues for potential users. These issues tend to revolve around the following areas: reliability, data processing and the related applications. This thesis studies GPS tracking from the methodological, technical and practical perspectives. It first evaluates the reliability of GPS-based traffic data using data from an experiment covering three different traffic modes (car, bike and bus) traveling along the road network. It then outlines the general procedure for processing GPS tracking data and discusses related issues uncovered by using real-world GPS tracking data from 316 cars. Thirdly, it investigates the influence of road network density on finding optimal locations for enhancing travel efficiency and decreasing travel cost. The results show that the geographical positioning is reliable. Velocity is slightly underestimated, whereas altitude measurements are unreliable. Post-processing techniques with auxiliary information are found to be necessary and important for resolving the inaccuracy of GPS data. The density of the road network influences the finding of optimal locations; the influence stabilizes at a certain level and does not deteriorate when the node density is higher.
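To make the post-processing point concrete, the following Python sketch shows one elementary cleaning step of the kind such a pipeline needs: deriving the speed between consecutive GPS fixes with the haversine formula and flagging implausible jumps as likely measurement errors. The fixes and the speed threshold are hypothetical, and the thesis's actual pipeline (map matching, auxiliary data, mode detection) is not reproduced here.

```python
import math

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance between two WGS84 points, in metres."""
    r = 6371000.0
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

# (timestamp_s, lat, lon) fixes; a hypothetical track, not thesis data.
fixes = [(0, 59.3293, 18.0686), (10, 59.3295, 18.0690), (20, 59.4000, 18.2000)]

MAX_SPEED_MS = 50.0  # assumed ceiling; anything faster is treated as a GPS error
for (t0, la0, lo0), (t1, la1, lo1) in zip(fixes, fixes[1:]):
    speed = haversine_m(la0, lo0, la1, lo1) / (t1 - t0)
    flag = " <- implausible, drop or smooth" if speed > MAX_SPEED_MS else ""
    print(f"t={t0}-{t1}s: {speed:.1f} m/s{flag}")
```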
Abstract:
This paper addresses the feasibility of implementing Japanese manufacturing systems in the United States. The recent success of Japanese transplant companies suggests that Just-In-Time (JIT) production is possible within America's industrial environment. Once American workers receive proper training, they have little difficulty participating in rapid setup procedures and utilizing the kanban system. Japanese transplants are gradually developing Japanese-style relationships with their American supplier companies by initiating long-term, mutually beneficial agreements. They are also finding ways to cope with America's problem of distance, which is steadily decreasing as an obstacle to JIT delivery. American companies, however, encounter significant problems in trying to convert traditionally organized factories to the JIT system. This paper demonstrates that it is both feasible and beneficial for American manufacturers to implement JIT production techniques. Many of the difficulties manufacturers experience center around a general lack of information about JIT. Once a company realizes its potential for setup-time reduction, a prerequisite for the JIT system, workers and managers can work together to create a new process for handling equipment changeover. Significant results are possible with minimal investment. Also, supervisors often do not realize that the JIT method of ordering goods from suppliers is compatible with current systems. This "kanban system" not only enhances current systems but also reduces the amount of paperwork and scheduling involved. When arranging JIT delivery of supplier goods, American manufacturers tend to overlook important aspects of JIT supplier management. However, by making long-term commitments, initiating the open exchange of information, assisting suppliers in reaching new standards of performance, increasing the level of communication, and relying more on suppliers' engineering capabilities, even American manufacturers can develop Japanese-style supplier relationships that enhance the effectiveness of the system.