10 results for Decisions and criterion

in DRUM (Digital Repository at the University of Maryland)


Relevance:

100.00%

Publisher:

Abstract:

Natural language processing has achieved great success in a wide range of applications, producing both commercial language services and open-source language tools. However, most methods take a static or batch approach, assuming that the model has all the information it needs and makes a one-time prediction. In this dissertation, we study dynamic problems where the input comes in a sequence instead of all at once, and the output must be produced while the input is arriving. In these problems, predictions are often made based only on partial information. We see this dynamic setting in many real-time, interactive applications. These problems usually involve a trade-off between the amount of input received (cost) and the quality of the output prediction (accuracy); therefore, the evaluation considers both objectives (e.g., plotting a Pareto curve). Our goal is to develop a formal understanding of sequential prediction and decision-making problems in natural language processing and to propose efficient solutions. Toward this end, we present meta-algorithms that take an existing batch model and produce a dynamic model to handle sequential inputs and outputs. We build our framework upon the theory of Markov Decision Processes (MDPs), which allows learning to trade off competing objectives in a principled way. The main machine learning techniques we use are from imitation learning and reinforcement learning, and we advance current techniques to tackle problems arising in our settings. We evaluate our algorithms on a variety of applications, including dependency parsing, machine translation, and question answering. We show that our approach achieves a better cost-accuracy trade-off than the batch approach and heuristic-based decision-making approaches.

We first propose a general framework for cost-sensitive prediction, where different parts of the input come at different costs. We formulate a decision-making process that selects pieces of the input sequentially, with the selection adaptive to each instance. Our approach is evaluated on both standard classification tasks and a structured prediction task (dependency parsing); we show that it achieves prediction quality similar to methods that use all of the input, while incurring a much smaller cost. Next, we extend the framework to problems where the input is revealed incrementally in a fixed order. We study two applications: simultaneous machine translation and quiz bowl (incremental text classification). We discuss challenges in this setting and show that adding domain knowledge eases the decision-making problem. A central theme throughout the chapters is an MDP formulation of a challenging problem with sequential input/output and trade-off decisions, accompanied by a learning algorithm that solves the MDP.
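As a rough illustration of this sequential setting, the sketch below casts the read-vs-predict choice as a tiny decision process: at each step the agent either reads the next piece of input (paying a cost) or stops and commits to a prediction. The confidence values, cost, and threshold policy are hypothetical stand-ins (the dissertation learns such a policy via imitation/reinforcement learning), not the actual model.

```python
# Minimal sketch of a read-vs-predict decision process. The confidence
# model, cost, and threshold are hypothetical, not the dissertation's
# learned policy.
from dataclasses import dataclass

READ, PREDICT = "read", "predict"

@dataclass
class State:
    pieces_read: int
    confidence: float  # batch model's confidence on the partial input

def policy(state: State, threshold: float = 0.9) -> str:
    """Threshold baseline; a learned policy would replace this rule."""
    return PREDICT if state.confidence >= threshold else READ

def run_episode(confidences, read_cost=0.05):
    """Feed input one piece at a time; return (total cost, pieces consumed)."""
    cost = 0.0
    for i, conf in enumerate(confidences, start=1):
        if policy(State(pieces_read=i, confidence=conf)) == PREDICT:
            return cost, i
        cost += read_cost
    return cost, len(confidences)  # input exhausted: predict anyway

# Confidence typically grows as more input arrives.
print(run_episode([0.4, 0.6, 0.85, 0.93, 0.97]))  # ~ (0.15, 4)
```

A learned policy can beat any fixed threshold because it can trade cost against accuracy differently per instance, which is the cost-accuracy trade-off the evaluation plots as a Pareto curve.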

Relevance:

90.00%

Publisher:

Abstract:

A primary goal of context-aware systems is delivering the right information at the right place and right time, enabling users to make effective decisions and improve their quality of life. There are three key requirements for achieving this goal: determining what information is relevant, personalizing it based on the user's context (location, preferences, behavioral history, etc.), and delivering it in a timely manner without an explicit request. These requirements create a paradigm that we term “Proactive Context-Aware Computing”. Most existing context-aware systems fulfill only a subset of these requirements: many focus only on personalizing requested information based on the user's current context, and they are often designed for specific domains. In addition, most existing systems are reactive: users request information and the system delivers it. These systems are not proactive, i.e., they cannot anticipate users' intent and behavior and act without an explicit request. To overcome these limitations, we need to conduct a deeper analysis and enhance our understanding of context-aware systems that are generic, universal, proactive, and applicable to a wide variety of domains. To support this dissertation, we explore several directions. Smartphones are clearly the most significant sources of information about users today: a large amount of a user's context can be acquired through them, and they are an effective means of delivering information to users. In addition, social media such as Facebook, Flickr, and Foursquare provide a rich and powerful platform for mining users' interests, preferences, and behavioral history. We employ the ubiquity of smartphones and the wealth of information available from social media to address the challenge of building proactive context-aware systems. We have implemented and evaluated several approaches, including some as part of the Rover framework, to achieve the paradigm of Proactive Context-Aware Computing. Rover is a context-aware research platform that has been evolving for the last six years. Since location is one of the most important elements of a user's context, we have developed 'Locus', an indoor localization, tracking, and navigation system for multi-story buildings. Another important dimension of a user's context is the set of activities they are engaged in; to this end, we have developed 'SenseMe', a system that leverages the smartphone and its multiple sensors to perform multidimensional context and activity recognition for users. As part of the 'SenseMe' project, we also conducted an exploratory study of privacy, trust, risks, and other concerns of users with smartphone-based personal sensing systems and applications. To determine what information would be relevant to a user's situation, we have developed 'TellMe', a system that employs a new, flexible, and scalable approach based on Natural Language Processing techniques to perform bootstrapped discovery and ranking of relevant information in context-aware systems. To personalize the relevant information, we have also developed an algorithm and system for mining a broad range of users' preferences from their social network profiles and activities.

For recommending new information to users based on their past behavior and context history (such as visited locations, activities, and time), we have developed a recommender system and approach for performing multi-dimensional collaborative recommendations using tensor factorization. For timely delivery of personalized and relevant information, it is essential to anticipate and predict users' behavior. To this end, we have developed a unified infrastructure within the Rover framework and implemented several novel approaches and algorithms that employ various contextual features and state-of-the-art machine learning techniques to build diverse behavioral models of users. Examples of generated models include classifying users' semantic places and mobility states, predicting their availability for accepting calls on smartphones, and inferring their device-charging behavior. Finally, to enable proactivity in context-aware systems, we have also developed a planning framework based on Hierarchical Task Network (HTN) planning. Together, these works provide a major push in the direction of proactive context-aware computing.
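As a rough sketch of the multi-dimensional tensor-factorization recommendation mentioned above, the following fits a CP decomposition of a (user x activity x time-slot) preference tensor by alternating least squares; the tensor, dimensions, and rank are invented for illustration, and this is not the Rover implementation.

```python
# Minimal sketch of multi-dimensional collaborative recommendation via CP
# tensor factorization, fit with alternating least squares (ALS).
# Synthetic data; illustrative only.
import numpy as np

rng = np.random.default_rng(0)
n_users, n_acts, n_slots, rank = 20, 15, 8, 4

# Synthetic observations with a known low-rank structure.
U0, A0, T0 = (rng.random((n, rank)) for n in (n_users, n_acts, n_slots))
X = np.einsum('ur,ar,tr->uat', U0, A0, T0)

# Factor matrices to learn, each updated by exact least squares in turn.
U, A, T = (rng.random((n, rank)) for n in (n_users, n_acts, n_slots))
for _ in range(50):
    U = np.einsum('uat,ar,tr->ur', X, A, T) @ np.linalg.pinv((A.T @ A) * (T.T @ T))
    A = np.einsum('uat,ur,tr->ar', X, U, T) @ np.linalg.pinv((U.T @ U) * (T.T @ T))
    T = np.einsum('uat,ur,ar->tr', X, U, A) @ np.linalg.pinv((U.T @ U) * (A.T @ A))

# Recommend: predicted scores of user 3 over all activities in time slot 5.
scores = np.einsum('r,ar,r->a', U[3], A, T[5])
print("top-3 activities:", scores.argsort()[::-1][:3])
```

The third mode (time slot) is what makes the recommendation context-aware: the same user gets different activity rankings at different times.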

Relevance:

90.00%

Publisher:

Abstract:

Travel demand models are important tools in the analysis of transportation plans, projects, and policies. The modeling results are useful to transportation planners making transportation decisions and to policy makers developing transportation policies. Defining the level of detail (i.e., the number of roads) of the transport network consistently with the travel demand model's zone system is crucial to the accuracy of modeling results. However, travel demand modelers have not had tools to determine how much detail is needed in a transport network for a given travel demand model. This dissertation seeks to fill this knowledge gap by (1) providing a methodology to define an appropriate level of detail for a transport network in a given travel demand model; (2) implementing this methodology in a travel demand model for the Baltimore area; and (3) identifying how this methodology improves modeling accuracy. All analyses show that the spatial resolution of the transport network has a great impact on the modeling results. For example, when compared to observed traffic data, a very detailed network underestimates traffic congestion in the Baltimore area, while a network developed by this dissertation models traffic conditions more accurately. Evaluating the impacts of a new transportation project on both networks yields different analysis results, which points out the importance of an appropriate level of network detail for making improved planning decisions. The results corroborate a suggested guideline for developing a transport network consistent with the travel demand model's zone system. To conclude, limitations are identified in the data sources and methodology, based on which a plan for future studies is laid out.
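One standard way to judge whether a network's level of detail is adequate, in the spirit of the comparison with observed traffic described above, is a percent root-mean-square error (%RMSE) between modeled and observed link volumes. The sketch below uses made-up volumes and is not the dissertation's actual validation procedure.

```python
# Sketch: %RMSE of modeled vs. observed link volumes, a common travel-model
# validation statistic. All volumes here are hypothetical.
import math

def percent_rmse(modeled, observed):
    """Percent root-mean-square error relative to the mean observed volume."""
    n = len(observed)
    rmse = math.sqrt(sum((m - o) ** 2 for m, o in zip(modeled, observed)) / n)
    return 100.0 * rmse / (sum(observed) / n)

observed = [1200, 860, 1540, 700]        # hypothetical traffic counts
detailed_net = [900, 700, 1300, 620]     # very detailed network: underestimates
calibrated_net = [1150, 830, 1500, 690]  # network built per the methodology

print(percent_rmse(detailed_net, observed))    # ~19.7: larger error
print(percent_rmse(calibrated_net, observed))  # ~3.3: smaller error
```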

Relevance:

90.00%

Publisher:

Abstract:

Scientific studies exploring the environmental and experiential elements that help boost human happiness have become a significant and expanding body of work. Some urban designers, architects, and planners are looking to apply this knowledge through policy decisions and design, but there is a great deal of room for further study and exploration. This paper looks at definitions of happiness and the happiness measurements used in research. It then introduces six environmental factors, identified in a literature review, that have design implications relating to happiness: Nature, Light, Surprise, Access, Identity, and Sociality. Architectural precedents are examined and design strategies are proposed for each factor, which are then applied to a test-case site and building in Baltimore, Maryland. It is anticipated that these factors and strategies will be useful to architects, urban designers, and planners as they endeavor to design positive user experiences and set city-shaping policy.

Relevance:

90.00%

Publisher:

Abstract:

Beef constitutes a main component of the American diet and still represents the principal source of protein in many parts of the world. The meat market is currently experiencing an important transformation, with consumers increasingly switching from traditional beef to grass-fed beef: products from grass-fed animals are perceived as more natural and healthy. However, the true differences between these two production systems remain unclear in many respects. This dissertation draws on closely genetically related animals, in order to reduce confounding factors, to clarify several debated divergences between grain-fed and grass-fed beef. First, we examined the growth curve, important economic traits, and carcass quality characteristics over four consecutive years in grain-fed and grass-fed animals, generating valuable information for management decisions and economic evaluation of grass-fed cattle operations. Second, we performed the first integrated transcriptomic and metabolomic analysis of grass-fed beef, detecting alterations in glucose metabolism, divergences in free fatty acid and carnitine-conjugated lipid levels, and altered β-oxidation. The results suggest that grass-finished beef could benefit consumer health through lower total fat content and a better lipid profile than grain-fed beef; regarding animal welfare, grass-fed animals may also experience less stress than grain-fed individuals. Finally, we contrasted the genome-wide DNA methylation of grass-fed and grain-fed beef using the methyl-CpG binding domain sequencing (MBD-Seq) method, identifying 60 differentially methylated regions (DMRs). Most DMRs were located inside or upstream of genes and displayed increased methylation in grass-fed individuals, implying a global increase in DNA methylation in this group. Interestingly, chromosome 14, which has been associated with large effects on average daily gain (ADG), marbling, back fat, ribeye area, and hot carcass weight in beef cattle, harbored the largest number of DMRs (12/60). Pathway analysis identified the skeletal and muscular system as the preeminent physiological system and function, and recognized carbohydrate metabolism, lipid metabolism, and tissue morphology among the highest-ranked networks. Therefore, although we recognize some limitations and additional examination is still required, this project provides the first integrative genomic, epigenetic, and metabolomic characterization of beef produced under a grass-fed regimen.
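For a sense of how DMRs are called from MBD-Seq counts, here is a deliberately simplified sketch based on log2 fold change between groups; the regions, counts, and threshold are hypothetical, and a real pipeline (including the dissertation's, which is not reproduced here) would add normalization and statistical testing.

```python
# Simplified sketch of calling differentially methylated regions (DMRs)
# from per-region MBD-Seq read counts, grass-fed vs. grain-fed.
# All regions, counts, and thresholds are hypothetical.
import math
from statistics import mean

def log2_fold_change(grass, grain, pseudo=1.0):
    """log2 ratio of mean counts; the pseudo-count avoids division by zero."""
    return math.log2((mean(grass) + pseudo) / (mean(grain) + pseudo))

regions = {
    # region id: (grass-fed replicate counts, grain-fed replicate counts)
    "chr14:23.1Mb": ([80, 85, 75], [30, 28, 32]),
    "chr5:10.4Mb":  ([12, 15, 11], [13, 12, 14]),
}

for region, (grass, grain) in regions.items():
    lfc = log2_fold_change(grass, grain)
    if abs(lfc) >= 1.0:  # real analyses also require statistical support
        direction = "hyper" if lfc > 0 else "hypo"
        print(f"{region}: {direction}methylated in grass-fed (log2FC = {lfc:.2f})")
```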

Relevance:

80.00%

Publisher:

Abstract:

A relatively unexplored area of the harpsichord repertoire is the group of transcriptions made by J.S. Bach (1685-1750), Jean Henry d'Anglebert (1629-1691), and Jean-Baptiste Forqueray (1699-1782). These transcriptions are valuable and worth exploring and performing. Studying them provides unique insights into each composer's musical thinking: by comparing transcriptions with their original sources, the transcriber's decisions and priorities can be observed. The performance component of this dissertation comprises three recitals. The first features works of Johann Sebastian Bach: two transcriptions of violin concerti by Antonio Vivaldi (1678-1741), and two transcriptions of trio sonatas by Johann Adam Reinken (1643-1722). The most salient feature of Bach's transcriptions is his addition of musical material: ornamenting slow movements, adding diminutions and idiomatic keyboard figurations throughout, and recomposing and expanding fugal movements. The second recital features works of Jean Henry d'Anglebert and Jean-Baptiste Forqueray, two French composer-performers. From d'Anglebert's many transcriptions, I assembled two key-related suites: the first comprising lute pieces by Ennemond Gaultier (c. 1575-1651), and the second comprising movements from operas by Jean-Baptiste Lully (1632-1687). Forqueray's transcriptions are of suites for viola da gamba and continuo composed by his father, Antoine Forqueray (1671-1745). Creative and varied ornamentation, along with the style brisé of arpeggiated chords, are the most important features of d'Anglebert's transcriptions; Forqueray's transcriptions are highly virtuosic and often feature the tenor and bass range of the harpsichord. The third recital features my own transcriptions: the first suite for solo cello by J.S. Bach, excerpts from the opera La Descente d'Orphée aux Enfers by Marc-Antoine Charpentier (1643-1704), and two violin pieces by Nicola Matteis (fl. c. 1670-c. 1698). In these transcriptions, I demonstrate what I have learned from studying and performing the works in the first two recitals. These recitals were performed in the Leah Smith Hall at the University of Maryland on May 4, 2010; May 11, 2010; and October 7, 2010. They were recorded on compact discs and are archived within the Digital Repository at the University of Maryland (DRUM).

Relevance:

80.00%

Publisher:

Abstract:

By law, Title I schools employ teachers who are both competent in their subject knowledge and state certified. In addition, Title I teachers receive ongoing professional development in technology integration and are equipped with the latest innovative resources for integrating technology in the classroom. The aim is higher academic achievement and effective use of technology in the classroom. The investment in implementing technology in this large urban school district to improve student achievement has continued to increase. To infuse current and emerging technology throughout the curriculum, the district needs to know where teachers have, and have not, integrated technology; yet the level of technology integration in its Title I schools is unknown. This study used the Levels of Teaching Innovation (LoTi) Digital-Age Survey to assess the technology integration levels of 508 Title I teachers across three major initiatives purchased by Title I: the iPad program, the Chromebook initiative, and the interactive whiteboard program. The study used a quantitative approach. Descriptive statistics, regression analysis, and statistical correlations were used to examine the relationship between the level of technology integration and the following dependent variables: personal computer use (PCU), current instructional practices (CIP), and level of teaching innovation (LoTi). With this information, budgetary decisions and professional development can be tailored to meet the district's technology implementation needs. The study found a significant relationship between the level of teaching innovation, personal computer use, and current instructional practices among teachers who teach with iPads, Chromebooks, and/or interactive whiteboards. LoTi, PCU, and CIP scores increased with the years of experience of Title I teachers, and there was a significant relationship between teachers with 20 or more years of teaching experience and their LoTi score.
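The association between LoTi scores and years of experience reported above is the kind of relationship a simple linear regression captures. The sketch below uses fabricated scores purely for illustration, since the study's data are not reproduced here.

```python
# Minimal sketch: linear regression of a teacher's LoTi score on years of
# teaching experience, with Pearson correlation. Data are fabricated.
import numpy as np

years = np.array([2, 5, 8, 12, 15, 20, 24, 30], dtype=float)
loti = np.array([2.1, 2.4, 3.0, 3.2, 3.8, 4.1, 4.3, 4.6])

# Least-squares fit: loti ~ slope * years + intercept.
slope, intercept = np.polyfit(years, loti, deg=1)
r = np.corrcoef(years, loti)[0, 1]  # Pearson correlation

print(f"slope={slope:.3f} LoTi points/year, r={r:.2f}")
```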

Relevance:

80.00%

Publisher:

Abstract:

Over the last decade, the success of social networks has significantly reshaped how people consume information. Recommendation of content based on user profiles is well received. However, as users become predominantly mobile, little has been done to account for the wireless environment, especially its capacity constraints and changing channel. In this dissertation, we investigate a centralized wireless content delivery system, aiming to optimize overall user experience given the capacity constraints of the wireless networks by deciding what content to deliver, when, and how. We propose a scheduling framework that incorporates content-based reward and deliverability. Our approach exploits the broadcast nature of wireless communication and the social nature of content through multicasting and precaching. Results indicate that this joint optimization approach outperforms existing layered systems that separate recommendation and delivery, especially when the wireless network is operating at maximum capacity. By utilizing a limited number of transmission modes, we significantly reduce the complexity of the optimization. We also introduce the design of a hybrid system that handles transmissions for both system-recommended content ('push') and active user requests ('pull').

Further, we extend the joint optimization framework to a wireless infrastructure with multiple base stations. The problem becomes much harder, as there are many more system configurations, including power allocation and how resources are shared among the base stations ('out-of-band', in which base stations transmit on dedicated spectrum with no interference; and 'in-band', in which they share the spectrum and must mitigate interference). We propose a scalable two-phase scheduling framework: 1) each base station obtains delivery decisions and resource allocations individually; 2) the system consolidates the decisions and allocations, removing redundant transmissions.

Additionally, if social network applications can predict how social content disseminates, the wireless network can schedule transmissions accordingly and significantly improve dissemination performance by reducing delivery delay. We propose a novel method utilizing 1) hybrid systems to handle active dissemination requests and 2) predictions of dissemination dynamics from the social network applications. This method mitigates the performance degradation of content dissemination caused by wireless delivery delay. Results indicate that our proposed system design is both efficient and easy to implement.
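To make the joint recommend-and-deliver idea concrete, here is a hedged sketch in which contents are chosen for multicast within a capacity budget and scored by total reward across interested users (one transmission serves all of them). The greedy knapsack-style heuristic and all numbers are illustrative stand-ins for the dissertation's optimization.

```python
# Illustrative sketch: pick contents to multicast within a capacity budget,
# scoring each by total reward over interested users. Contents, rewards,
# and sizes are hypothetical; greedy selection stands in for the actual
# joint optimization.

contents = {
    # content id: (per-user reward, size in resource blocks, interested users)
    "news":  (1.0, 4, {"u1", "u2", "u3", "u4"}),
    "video": (3.0, 10, {"u2", "u3"}),
    "meme":  (0.5, 1, {"u1", "u2", "u3", "u4", "u5"}),
}
capacity = 12

# Greedy by reward density: multicast reward / resource cost.
ranked = sorted(contents.items(),
                key=lambda kv: kv[1][0] * len(kv[1][2]) / kv[1][1],
                reverse=True)

schedule, used = [], 0
for cid, (reward, size, users) in ranked:
    if used + size <= capacity:  # fits in the remaining wireless capacity
        schedule.append(cid)
        used += size
print(schedule, f"{used}/{capacity} blocks")
```

The multicast term len(users) in the score is where the social and broadcast natures meet: popular content earns its cost back across many users at once, which a layered recommend-then-deliver system cannot exploit.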

Relevance:

80.00%

Publisher:

Abstract:

Universities rely on Information Technology (IT) projects to support and enhance their core strategic objectives of teaching, research, and administration. The literature review for this research found that the level of IT funding and resources in universities is not adequate to meet IT demands: universities receive more IT project requests than they can execute, and so must fund IT projects selectively. The objectives of IT projects in universities vary; a project that benefits teaching functions may not benefit administrative functions, which makes IT project selection challenging. To aid IT decision making, many universities in the United States of America (USA) have formed IT Governance (ITG) processes. ITG is an IT decision-making and accountability framework whose purpose is to align an organization's IT efforts with its strategic objectives, realize the value of IT investments, meet expected performance criteria, and manage risks and resources (Weil & Ross, 2004). ITG in universities is relatively new, and it is not well known how ITG processes are aiding nonprofit universities in selecting the right IT projects and managing their performance. This research adds to the body of knowledge regarding IT project selection under a governance structure, the maturity of IT projects, and IT project performance in nonprofit universities. A case-study methodology was chosen for this exploratory research. Convenience sampling was used to choose cases from two large research universities with decentralized colleges and two small, centralized universities. Data were collected on nine IT projects from these four universities through interviews and university documents. The multi-case analysis was complemented by Qualitative Comparative Analysis (QCA) to systematically analyze how IT conditions lead to an outcome. This research found that IT projects were selected in a more informed manner in the centralized universities. ITG was more authoritative in the small centralized universities: the ITG committees included the key decision makers, decision-making roles and responsibilities were better defined, and the frequency of ITG communication was higher. In the centralized universities, business units and colleges brought IT requests to the ITG committees, which in turn prioritized the requests and allocated funds and resources to the IT projects. ITG committee members in the centralized universities had a higher awareness of university-wide IT needs, and the IT projects tended to align with the strategic objectives. On the other hand, the decentralized colleges and business units in the large universities were influential and often bypassed the ITG processes. The decentralized units often chose "pet" IT projects and executed them in silos, without bringing them to the attention of the ITG committees. While these IT projects met departmental objectives, they did not always align with the university's strategic objectives.

This research also found that IT project maturity in a university can be increased by following project management methodologies. IT project management maturity was higher in projects executed by a centralized university, where a full-time project manager with greater project management expertise was assigned. Projects executed under the guidance of a Project Management Office (PMO) exhibited higher project management maturity, as the PMO set standards and controls for the project. Projects in the decentralized colleges, typically managed part-time by business or technical leads with less project management expertise, exhibited lower project management maturity. The research further found that the higher the IT project management maturity, the better the project performance: projects with higher maturity had lower delay, fewer missed requirements, and fewer IT system errors. Overall, the quality of IT decisions in a university can be improved by centralizing IT decision-making processes, and IT project management maturity can be improved by following project management methodologies. Stakeholder management and communication were found to be critical for the success of IT projects. It is hoped that the findings from this research will help university leaders make strategic IT decisions and university IT project managers make IT project decisions.
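QCA, as used above, tabulates binary case conditions into a truth table and checks how consistently each combination of conditions co-occurs with the outcome. The sketch below illustrates that step with hypothetical cases and conditions, not the study's nine actual projects.

```python
# Illustrative sketch of the QCA truth-table step: group cases by their
# condition vector and compute how consistently each combination leads to
# the outcome. Cases and conditions are hypothetical.
from collections import defaultdict

# condition vector: (centralized governance, full-time PM, PMO oversight)
cases = {
    "proj_A": ((1, 1, 1), 1),  # outcome 1 = high project performance
    "proj_B": ((1, 1, 0), 1),
    "proj_C": ((0, 0, 0), 0),
    "proj_D": ((0, 1, 0), 1),
    "proj_E": ((0, 0, 1), 0),
}

truth_table = defaultdict(list)
for name, (conditions, outcome) in cases.items():
    truth_table[conditions].append(outcome)

for conditions, outcomes in sorted(truth_table.items(), reverse=True):
    consistency = sum(outcomes) / len(outcomes)  # share of cases with outcome
    print(conditions, f"n={len(outcomes)}", f"consistency={consistency:.2f}")
```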

Relevance:

80.00%

Publisher:

Abstract:

This thesis deals with quantifying the resilience of a network of pavements. Calculations were carried out by modeling network performance under a set of possible damage-meteorological scenarios with known probabilities of occurrence. Resilience was evaluated a priori while accounting for optimal preparedness decisions and the additional response actions that can be taken under each scenario. Unlike the common assumption that the pre-event condition of all system components is uniform, fixed, and pristine, the evolution of component condition was incorporated herein. For this purpose, the health of each system component immediately prior to hazard impact, under all considered scenarios, was associated with a serviceability rating. This rating was projected to reflect both natural deterioration and any intermittent improvements due to maintenance. The scheme was demonstrated on a hypothetical case study involving LaGuardia Airport. Results show that resilience can be impacted by the condition of the infrastructure elements, their natural deterioration processes, and the prevailing maintenance plans. The findings imply that, in general, ordinary resilience work reports upper-bound values, and that including evolving component conditions is of value.
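Given the abstract's ingredients (scenarios with known probabilities, preparedness decisions taken before the event, and response actions per scenario), the a priori evaluation can be written as a scenario expectation. The notation below is an assumed minimal formalization for illustration, not necessarily the thesis's own:

\[
R(x) \;=\; \sum_{s \in S} p_s \, \max_{y_s \in Y_s(x)} Q_s(x, y_s),
\qquad \sum_{s \in S} p_s = 1,
\]

where \(S\) is the set of damage-meteorological scenarios with occurrence probabilities \(p_s\); \(x\) denotes the preparedness decisions taken before the event; \(y_s\) the response actions available under scenario \(s\); and \(Q_s\) the retained performance of the pavement network, computed from component serviceability ratings projected to the moment of impact. The reported resilience is then \(\max_x R(x)\) under a preparedness budget. Assuming pristine pre-event components inflates \(Q_s\), which is consistent with the finding above that ordinary resilience work reports upper-bound values.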