864 results for LARGE NUMBERS
Abstract:
Objective: To test the practicality and effectiveness of cheap, ubiquitous, consumer-grade smartphones to discriminate Parkinson's disease (PD) subjects from healthy controls, using self-administered tests of gait and postural sway. Background: Existing tests for the diagnosis of PD are based on subjective neurological examinations performed in-clinic. Objective movement symptom severity data, collected using widely accessible technologies such as smartphones, would enable the remote characterization of PD symptoms based on self-administered, behavioral tests. Smartphones, when backed up by interviews using web-based videoconferencing, could make it feasible for expert neurologists to perform diagnostic testing on large numbers of individuals at low cost. However, to date, the compliance rate of testing using smartphones has not been assessed. Methods: We conducted a one-month controlled study with twenty participants, comprising 10 PD subjects and 10 controls. All participants were provided identical LG Optimus S smartphones, capable of recording tri-axial acceleration. Using these smartphones, patients conducted self-administered, short (less than 5-minute) controlled gait and postural sway tests. We analyzed a wide range of summary measures of gait and postural sway from the accelerometry data. Using statistical machine learning techniques, we identified discriminating patterns in the summary measures in order to distinguish PD subjects from controls. Results: Compliance was high: all 20 participants performed an average of 3.1 tests per day for the duration of the study. Using this test data, we demonstrated cross-validated sensitivity of 98% and specificity of 98% in discriminating PD subjects from healthy controls. Conclusions: Using consumer-grade smartphone accelerometers, it is possible to distinguish PD from healthy controls with high accuracy. Since these smartphones are inexpensive (around $30 each) and easily available, and the tests are highly non-invasive and objective, we envisage that this kind of smartphone-based testing could radically increase the reach and effectiveness of experts in diagnosing PD.
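A minimal sketch of the kind of cross-validated discrimination described above, assuming summary gait and postural sway measures have already been extracted into a feature matrix; the placeholder data, feature count, and random-forest classifier are illustrative assumptions, not the study's actual pipeline.

```python
# Illustrative only: cross-validated classification of PD vs. control from
# pre-computed accelerometry summary features (placeholder data, not the study's).
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import confusion_matrix
from sklearn.model_selection import cross_val_predict

rng = np.random.default_rng(0)
# Rows = self-administered tests; columns = assumed summary measures of gait
# and postural sway (e.g., step-interval variability, sway amplitude).
X = rng.normal(size=(200, 12))
y = rng.integers(0, 2, size=200)              # 1 = PD subject, 0 = healthy control

clf = RandomForestClassifier(n_estimators=200, random_state=0)
y_pred = cross_val_predict(clf, X, y, cv=10)  # 10-fold cross-validated predictions

tn, fp, fn, tp = confusion_matrix(y, y_pred).ravel()
print(f"sensitivity={tp / (tp + fn):.2f}, specificity={tn / (tn + fp):.2f}")
```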
Abstract:
This study examined the motivating factors for perpetrators of antigay harassment and violence among 752 college freshmen. Large numbers of lesbians, gay men and bisexuals (LGB) are victimized solely because of their sexual orientation. The physical and psychological harm suffered by many of these individuals is alarming. In particular, victimization at school is correlated with a variety of other health risks for LGB students. In order for prevention efforts to be effectively tailored, it may be helpful for researchers to first identify what motivates the assailants. This study tested variables capturing demographic, psychosocial, and attitudinal factors. This purposive sample was selected because these students represent the age group most likely to become perpetrators. The findings suggest that harassment of gay people is common and, in many cases, not motivated by particularly negative attitudes toward homosexuals. Instead, LGB individuals may be viewed as a socially acceptable target by others to harass out of boredom, anger at someone else, or in an attempt to assert their own threatened heterosexuality. Social norms, along with the variety and weakness of individual predictors for antigay harassment, further suggest that heterosexism is endemic and pervasive in our society. Physical attacks against homosexuals, although less common, represent a more serious problem for the victims. This study discovered that there were some leading predictors for these assaults, namely, being male, having been maltreated, being a heavy social drinker, and having defensive, antigay attitudes. The implications of these findings and imperatives for social workers are discussed.
Abstract:
In China in particular, large, planned special events (e.g., the Olympic Games) are viewed as great opportunities for economic development. Large numbers of visitors from other countries and provinces may be expected to attend such events, bringing in significant tourism dollars. However, as a direct result of such events, the transportation system is likely to face great challenges as travel demand increases beyond its original design capacity. Special events in central business districts (CBD) in particular will further exacerbate traffic congestion on surrounding freeway segments near event locations. To manage the transportation system, it is necessary to plan and prepare for such special events, which requires prediction of traffic conditions during the events. This dissertation presents a set of novel prototype models to forecast traffic volumes along freeway segments during special events. Almost all research to date has focused solely on traffic management techniques under special event conditions. These studies, at most, provided a qualitative analysis, and there was a lack of an easy-to-implement method for quantitative analyses. This dissertation presents a systematic approach, based separately on a univariate time series model with intervention analysis and a multivariate time series model with intervention analysis, for forecasting traffic volumes on freeway segments near an event location. A case study was carried out, which involved analyzing and modelling the historical time series data collected from loop-detector traffic monitoring stations on the Second and Third Ring Roads near Beijing Workers Stadium. The proposed time series models, with expected intervention, are found to provide reasonably accurate forecasts of traffic pattern changes efficiently. They may be used to support transportation planning and management for special events.
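A minimal sketch of a univariate time series model with an intervention term, in the spirit of the approach described above; the synthetic series, event window, and model orders are placeholders rather than anything fitted in the dissertation.

```python
# Illustrative only: seasonal ARIMA model with an intervention (event) dummy
# supplied as an exogenous regressor, fit on synthetic hourly traffic volumes.
import numpy as np
import pandas as pd
from statsmodels.tsa.statespace.sarimax import SARIMAX

rng = np.random.default_rng(1)
n = 24 * 14                                    # two weeks of hourly counts
t = pd.date_range("2008-08-01", periods=n, freq="h")
volume = 1000 + 200 * np.sin(2 * np.pi * np.arange(n) / 24) + rng.normal(0, 50, n)

# Assumed intervention: a special event raises demand during a known window.
event = ((t >= "2008-08-08 18:00") & (t <= "2008-08-08 23:00")).astype(float)
volume = volume + 400 * event

model = SARIMAX(volume, exog=event, order=(1, 0, 1), seasonal_order=(1, 0, 1, 24))
result = model.fit(disp=False)
print(result.params)                           # includes the estimated event effect
```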
Abstract:
Large numbers of colonially nesting herons, egrets, ibises, storks and spoonbills were one of the defining natural phenomena of the historical Everglades. Reproduction of these species has been tracked over at least a century, and some clear responses to dramatic anthropogenic hydrological alterations have been established. These include a marked decline in nesting populations of several species, and a movement of colonies away from the over-drained estuarine region. Ponding in a large portion of the freshwater marsh has favored species that hunt by sight in deep water (egrets, cf. 25–45 cm), while tactile feeders (ibises and storks) that depend on concentrated prey in shallow water (5–25 cm) have become proportionately much less common. There has been a marked increase in the interval between exceptionally large breeding aggregations of White Ibises (Eudocimus albus). Loss of short hydroperiod wetlands on the margins of the Everglades has delayed nest initiations by Wood Storks (Mycteria americana) by 1–2 months, resulting in poor nesting success. These responses are consistent with mechanisms that involve foraging and the availability and production of prey animals, and each of the relationships is highly dependent on hydrology. Here, we define a group of characteristics of wading bird dynamics (= indicators) that collectively track the specific ecological relationships that supported ibises and storks in the past. We suggest four metrics as indicators of restoration success: timing of nesting by storks, the ratio of nesting ibises + storks to Great Egrets, the proportion of all nests located in the estuarine/freshwater ecotone, and the interval between years with exceptionally large ibis nestings. Each of these metrics has historical (e.g., predrainage) data upon which to base expectations for restoration, and the metrics have little measurement error relative to the large annual variation in numbers of nests. In addition to the strong scientific basis for the use of these indicators, wading birds are also a powerful tool for public communication because they have strong aesthetic appeal, and their ecological relationships with water are intuitively understandable. In the interests of communicating with the public and decision-makers, we integrate these metrics into a single-page annual "traffic-light" report card for wading bird responses. Collectively, we believe these metrics offer an excellent chance of detecting restoration of the ecosystem functions that supported historical wading bird nesting patterns.
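A minimal sketch of how the four proposed indicator metrics could be computed from annual colony survey data; the column names, toy numbers, and the "exceptionally large" ibis threshold are illustrative assumptions, not values from the monitoring program.

```python
# Illustrative only: the four wading bird restoration indicators computed from
# a yearly nest-count table with assumed column names and toy values.
import pandas as pd

surveys = pd.DataFrame({
    "year": [2018, 2019, 2020, 2021],
    "stork_init_month": [3, 2, 1, 2],           # month Wood Storks began nesting
    "ibis_nests": [8000, 30000, 5000, 45000],
    "stork_nests": [400, 900, 300, 1100],
    "great_egret_nests": [6000, 7000, 5500, 7200],
    "ecotone_nests": [2000, 9000, 1500, 14000],
    "total_nests": [16000, 40000, 12000, 55000],
}).set_index("year")

timing = surveys["stork_init_month"]                      # 1) earlier nesting is better
ratio = (surveys["ibis_nests"] + surveys["stork_nests"]) / surveys["great_egret_nests"]  # 2)
ecotone_prop = surveys["ecotone_nests"] / surveys["total_nests"]                         # 3)
big_ibis_years = surveys.loc[surveys["ibis_nests"] > 25_000].index    # assumed threshold
interval = pd.Series(big_ibis_years).diff().dropna()      # 4) years between large nestings
print(timing, ratio, ecotone_prop, interval, sep="\n")
```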
Abstract:
Management retention in the school foodservice industry has been a growing concern for school district decision makers due to the large numbers of managers reaching retirement age and the shortage of qualified people to fill the positions. As with other foodservice positions, turnover rates and the shortage of service employees will continue to be challenges well into the 21st century. The current study employed a self-administered questionnaire and asked 101 school foodservice managers in Central Florida to rate their perceived importance of and their perceived experience with 20 employment characteristics of their job. There were significant differences in 17 of the 20 characteristics, highlighting significant gaps between perceived importance and perceived actual experience on the job, and indicating what would keep managers from changing jobs. Management and human resources implications are discussed.
Hospitality Graduate Students’ Program Choice Decisions: Implications for Faculty and Administrators
Abstract:
Despite rapid growth in the quality and volume of hospitality graduate research and education in recent years, little information is available in the extant body of literature about the program choices of hospitality management graduate students, information that is crucial for program administrators and faculty in their attempts to attract the most promising students to their programs. This paper reports on a study among graduate students in U.S. hospitality management programs designed to understand why they chose to pursue their degrees at their programs of choice. Given the large numbers of international students presently enrolled, the study additionally looked into why international hospitality management students chose to leave their home countries and why they decided to pursue a graduate degree in the U.S. Based on the findings, implications for hospitality administrators and faculty in the U.S. and abroad are discussed and directions for future research are presented.
Abstract:
Melanoma is one of the most aggressive types of cancer. It originates from the transformation of melanocytes present in the epidermal/dermal junction of the human skin. It is commonly accepted that melanomagenesis is influenced by the interaction of environmental factors, genetic factors, and tumor-host interactions. DNA photoproducts induced by UV radiation are, in normal cells, repaired by the nucleotide excision repair (NER) pathway. The prominent role of NER in cancer resistance is well exemplified by patients with Xeroderma Pigmentosum (XP). This disease results from mutations in the components of the NER pathway, such as the XPA and XPC proteins. In humans, NER pathway disruption leads to the development of skin cancers, including melanoma. Similar to humans afflicted with XP, Xpa- and Xpc-deficient mice show high sensitivity to UV light, leading to the development of skin cancers, but not melanoma. The Endothelin 3 (Edn3) signaling pathway is essential for proliferation, survival and migration of melanocyte precursor cells. Excessive production of Edn3 leads to the accumulation of large numbers of melanocytes in the mouse skin, where they are not normally found. In humans, the Edn3 signaling pathway has also been implicated in melanoma progression and its metastatic potential. The goal of this study was the development of the first UV-induced melanoma mouse model dependent on the over-expression of Edn3 in the skin. The UV-induced melanoma mouse model reported here is distinguishable from all previously published models by two features: melanocytes are not transformed a priori, and melanomagenesis arises only upon neonatal UV exposure. In this model, melanomagenesis depends on the presence of Edn3 in the skin. Disruption of the NER pathway due to the lack of Xpa or Xpc proteins was not essential for melanomagenesis; however, it enhanced melanoma penetrance and decreased melanoma latency after a single neonatal erythemal UV dose. Exposure to a second dose of UV at six weeks of age did not change the time of appearance or penetrance of melanomas in this mouse model. Thus, a combination of neonatal UV exposure with excessive Edn3 in the tumor microenvironment is sufficient for melanomagenesis in mice; furthermore, NER deficiency exacerbates this process.
Abstract:
In the last decade, large numbers of social media services have emerged and been widely used in people's daily life as important information sharing and acquisition tools. With a substantial amount of user-contributed text data on social media, it becomes a necessity to develop methods and tools for text analysis of this emerging data, in order to better utilize it to deliver meaningful information to users. Previous work on text analytics over the last several decades has mainly focused on traditional types of text like emails, news and academic literature, and several issues critical to text data on social media have not been well explored: 1) how to detect sentiment from text on social media; 2) how to make use of social media's real-time nature; 3) how to address information overload for flexible information needs. In this dissertation, we focus on these three problems. First, to detect sentiment of text on social media, we propose a non-negative matrix tri-factorization (tri-NMF) based dual active supervision method to minimize human labeling efforts for the new type of data. Second, to make use of social media's real-time nature, we propose approaches to detect events from text streams on social media. Third, to address information overload for flexible information needs, we propose two summarization frameworks: a dominating set based summarization framework and a learning-to-rank based summarization framework. The dominating set based summarization framework can be applied to different types of summarization problems, while the learning-to-rank based summarization framework helps utilize existing training data to guide new summarization tasks. In addition, we integrate these techniques in an application study of event summarization for sports games as an example of how to better utilize social media data.
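A minimal sketch of the dominating set idea applied to summarization: build a similarity graph over short texts and greedily select a set of texts that dominates the graph. The Jaccard similarity, threshold, and greedy selection rule here are illustrative assumptions rather than the dissertation's exact formulation.

```python
# Illustrative only: greedy dominating-set summarization over a similarity graph
# of short texts (the similarity measure and threshold are assumptions).
from itertools import combinations

def jaccard(a, b):
    sa, sb = set(a.lower().split()), set(b.lower().split())
    return len(sa & sb) / len(sa | sb) if sa | sb else 0.0

def dominating_set_summary(texts, threshold=0.2):
    n = len(texts)
    neighbors = {i: {i} for i in range(n)}          # each node covers itself
    for i, j in combinations(range(n), 2):
        if jaccard(texts[i], texts[j]) >= threshold:
            neighbors[i].add(j)
            neighbors[j].add(i)
    uncovered, summary = set(range(n)), []
    while uncovered:
        # Greedily pick the text that covers the most still-uncovered nodes.
        best = max(uncovered, key=lambda i: len(neighbors[i] & uncovered))
        summary.append(texts[best])
        uncovered -= neighbors[best]
    return summary

posts = ["Team A scores again late in the second half",
         "Another late goal for team A in the second half",
         "Fans react angrily to the referee decision",
         "The referee decision sparks debate among fans"]
print(dominating_set_summary(posts))
```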
Abstract:
The purpose of this research paper is to follow a line of ongoing investigations that discuss dates for the origin of the synoptic gospels and evaluate the arguments for early, late, and intermediate dating and their susceptibility to critique from opposing arguments. There are three principal components in dating theories: (1) data from the Greek in the earliest texts, (2) data concerning the provenance of the earliest texts, and (3) data from the historical context of the first century. The study is significant because, contrary to what might be expected, the starting and key point in deciding on a composition date is the Book of Acts of the Apostles. This study compiled and integrated information, in an unbiased fashion, based on reading and researching large numbers of texts by scholars, such as Hengel, who support an earlier dating, as well as those, such as Fitzmyer, who support a later dating. Furthermore, this study also required knowledge of those scholars who propose dates that do not fall into these main categories. The research demonstrated that by looking at the Book of Acts of the Apostles as the key starting point, the synoptic gospels were most likely composed before 70 CE, therefore supporting scholars who argue for an earlier date.
Abstract:
Background: Light microscopic analysis of diatom frustules is widely used both in basic and applied research, notably taxonomy, morphometrics, water quality monitoring and paleo-environmental studies. In these applications, usually large numbers of frustules need to be identified and/or measured. Although there is a need for automation in these applications, and image processing and analysis methods supporting these tasks have previously been developed, they did not become widespread in diatom analysis. While methodological reports for a wide variety of methods for image segmentation, diatom identification and feature extraction are available, no single implementation combining a subset of these into a readily applicable workflow accessible to diatomists exists. Results: The newly developed tool SHERPA offers a versatile image processing workflow focused on the identification and measurement of object outlines, handling all steps from image segmentation through object identification to feature extraction, and providing interactive functions for reviewing and revising results. Special attention was given to ease of use, applicability to a broad range of data and problems, and supporting high-throughput analyses with minimal manual intervention. Conclusions: Tested with several diatom datasets from different sources and of various compositions, SHERPA proved its ability to successfully analyze large amounts of diatom micrographs depicting a broad range of species. SHERPA is unique in combining the following features: application of multiple segmentation methods and selection of the one giving the best result for each individual object; identification of shapes of interest based on outline matching against a template library; quality scoring and ranking of resulting outlines supporting quick quality checking; extraction of a wide range of outline shape descriptors widely used in diatom studies and elsewhere; and minimizing the need for, but enabling, manual quality control and corrections. Although primarily developed for analyzing images of diatom valves originating from automated microscopy, SHERPA can also be useful for other object detection, segmentation and outline-based identification problems.
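A minimal sketch of the generic outline-extraction step such a workflow builds on, using OpenCV thresholding, contour detection, and Hu moments as stand-ins; it is not SHERPA's implementation, which applies multiple segmentation methods, template matching, and quality scoring. The input file name is a placeholder.

```python
# Illustrative only: segment one micrograph, keep the largest object outline,
# and compute simple shape descriptors (area, perimeter, Hu moments).
import cv2

image = cv2.imread("diatom.png", cv2.IMREAD_GRAYSCALE)   # assumed input image
assert image is not None, "place a test micrograph at diatom.png"

blurred = cv2.GaussianBlur(image, (5, 5), 0)
_, mask = cv2.threshold(blurred, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)

contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
outline = max(contours, key=cv2.contourArea)              # largest outline only

area = cv2.contourArea(outline)
perimeter = cv2.arcLength(outline, True)                  # closed contour length
hu_moments = cv2.HuMoments(cv2.moments(outline)).flatten()  # scale/rotation invariant
print(area, perimeter, hu_moments)
```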
Abstract:
Subspaces and manifolds are two powerful models for high-dimensional signals. Subspaces model linear correlation and are a good fit to signals generated by physical systems, such as frontal images of human faces and multiple sources impinging on an antenna array. Manifolds model sources that are not linearly correlated, but where signals are determined by a small number of parameters. Examples are images of human faces under different poses or expressions, and handwritten digits with varying styles. However, there will always be some degree of model mismatch between the subspace or manifold model and the true statistics of the source. This dissertation exploits subspace and manifold models as prior information in various signal processing and machine learning tasks.
A near-low-rank Gaussian mixture model measures proximity to a union of linear or affine subspaces. This simple model can effectively capture the signal distribution when each class is near a subspace. This dissertation studies how the pairwise geometry between these subspaces affects classification performance. When model mismatch is vanishingly small, the probability of misclassification is determined by the product of the sines of the principal angles between subspaces. When the model mismatch is more significant, the probability of misclassification is determined by the sum of the squares of the sines of the principal angles. Reliability of classification is derived in terms of the distribution of signal energy across principal vectors. Larger principal angles lead to smaller classification error, motivating a linear transform that optimizes principal angles. This linear transformation, termed TRAIT, also preserves some specific features in each class, being complementary to a recently developed Low Rank Transform (LRT). Moreover, when the model mismatch is more significant, TRAIT shows superior performance compared to LRT.
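A minimal sketch of the geometric quantities this analysis turns on: the principal angles between two subspaces and the product (or sum of squares) of their sines; the random subspaces here are purely for illustration.

```python
# Illustrative only: principal angles between two random subspaces, plus the
# product-of-sines and sum-of-squared-sines quantities discussed above.
import numpy as np
from scipy.linalg import subspace_angles

rng = np.random.default_rng(0)
A = rng.normal(size=(50, 3))          # columns span a 3-dimensional subspace of R^50
B = rng.normal(size=(50, 3))

angles = subspace_angles(A, B)        # principal angles (radians)
print("angles:", angles)
print("product of sines:", np.prod(np.sin(angles)))
print("sum of squared sines:", np.sum(np.sin(angles) ** 2))
```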
The manifold model enforces a constraint on the freedom of data variation. Learning features that are robust to data variation is very important, especially when the size of the training set is small. A learning machine with large numbers of parameters, e.g., a deep neural network, can describe a very complicated data distribution well. However, it is also more likely to be sensitive to small perturbations of the data, and to suffer from degraded performance when generalizing to unseen (test) data.
From the perspective of the complexity of function classes, such a learning machine has a huge capacity (complexity), which tends to overfit. The manifold model provides us with a way of regularizing the learning machine so as to reduce the generalization error and therefore mitigate overfitting. Two different overfitting-prevention approaches are proposed, one from the perspective of data variation, the other from capacity/complexity control. In the first approach, the learning machine is encouraged to make decisions that vary smoothly for data points in local neighborhoods on the manifold. In the second approach, a graph adjacency matrix is derived for the manifold, and the learned features are encouraged to be aligned with the principal components of this adjacency matrix. Experimental results on benchmark datasets are demonstrated, showing an obvious advantage of the proposed approaches when the training set is small.
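A minimal sketch of the first idea: a graph-smoothness penalty that discourages a model's outputs from varying sharply across neighboring points on the manifold. The k-nearest-neighbor graph construction and the use of the unnormalized Laplacian are illustrative assumptions.

```python
# Illustrative only: a graph-Laplacian smoothness penalty on model outputs f,
# computed over a k-nearest-neighbor graph built from the data.
import numpy as np
from sklearn.neighbors import kneighbors_graph

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 5))                  # data assumed to lie near a manifold
f = rng.normal(size=100)                       # model outputs at each data point

W = kneighbors_graph(X, n_neighbors=5, mode="connectivity").toarray()
W = np.maximum(W, W.T)                         # symmetrize the adjacency matrix
L = np.diag(W.sum(axis=1)) - W                 # unnormalized graph Laplacian

# f @ L @ f equals the sum over graph edges of (f_i - f_j)^2; adding a weighted
# version of it to the training loss encourages locally smooth decisions.
smoothness_penalty = f @ L @ f
print(smoothness_penalty)
```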
Stochastic optimization makes it possible to track a slowly varying subspace underlying streaming data. By approximating local neighborhoods using affine subspaces, a slowly varying manifold can be efficiently tracked as well, even with corrupted and noisy data. The more local neighborhoods are used, the better the approximation, but the higher the computational complexity. A multiscale approximation scheme is proposed, where the local approximating subspaces are organized in a tree structure. Splitting and merging of the tree nodes then allows efficient control of the number of neighborhoods. Deviation (of each datum) from the learned model is estimated, yielding a series of statistics for anomaly detection. This framework extends the classical changepoint detection technique, which only works for one-dimensional signals. Simulations and experiments highlight the robustness and efficacy of the proposed approach in detecting an abrupt change in an otherwise slowly varying low-dimensional manifold.
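A much-simplified sketch of the residual-based anomaly statistic: track a slowly drifting subspace with incremental PCA and monitor the projection residual of each incoming batch, which should spike at an abrupt change. Incremental PCA stands in for the proposed stochastic, multiscale tracker; the dimensions, drift rate, and change point are made up for illustration.

```python
# Illustrative only: incremental PCA as a crude subspace tracker; the per-batch
# reconstruction residual serves as an anomaly statistic for an abrupt change.
import numpy as np
from sklearn.decomposition import IncrementalPCA

rng = np.random.default_rng(0)
d, k = 20, 3
basis = np.linalg.qr(rng.normal(size=(d, k)))[0]           # current true subspace
ipca = IncrementalPCA(n_components=k)

for t in range(50):
    drift = 0.5 if t == 40 else 0.01                        # abrupt change at t = 40
    basis = np.linalg.qr(basis + drift * rng.normal(size=(d, k)))[0]
    batch = rng.normal(size=(32, k)) @ basis.T + 0.01 * rng.normal(size=(32, d))

    if t > 0:                                               # score after first fit
        recon = ipca.inverse_transform(ipca.transform(batch))
        residual = np.mean(np.sum((batch - recon) ** 2, axis=1))
        print(t, round(residual, 4))                        # expect a spike near t = 40
    ipca.partial_fit(batch)
```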
Abstract:
Large numbers of calcareous dinoflagellate cysts and the vegetative calcareous coccoid species Thoracosphaera heimii are generally found in sediments underlying oligotrophic and/or stratified (sub)surface water environments. It is difficult to distinguish between the relative importance of these two environmental parameters on calcareous cyst and T. heimii distribution as they usually covary, but this information is essential if we want to apply cysts properly in the reconstruction of palaeoenvironments and past surface water hydrography. In the multi-proxy core GeoB 1523-1 from the Ceará Rise region in the western equatorial Atlantic Ocean (covering the past 155 ka), periods of greatest oligotrophy are not synchronous with periods of greatest stratification (Rühlemann et al., 1996, doi:10.1016/S0025-3227(96)00048-5; Mulitza et al., 1997, doi:10.1130/0091-7613(1997)025<0335:PFAROP>2.3.CO;2; 335-338; Mulitza et al., 1998, doi:10.1016/S0012-821X(98)00012-0), giving us the unique opportunity to differentiate between the effects of both parameters on cyst accumulation. The calcareous cyst record of the core reflects prominent increases in accumulation rate of nearly all observed species only during the nutrient-enriched but more stratified isotopic (sub)stages 5.5, 5.3, 5.1 and 1. In this respect, the distribution trends in the core are more similar to those of the eastern equatorial upwelling region (GeoB 1105-4) than they are to those of the oligotrophic north-eastern Brazilian continental slope (GeoB 2204-2), even though temporal changes in bioproductivity are principally in antiphase between the eastern and western equatorial regions. We conclude that stratification of the upper water column and the presence of a well-developed thermocline are probably the more important factors controlling cyst distribution in the equatorial Atlantic, whereas the state of oligotrophy secondarily influences cyst production within a well-stratified environment.