391 results for digital repository management


Relevance: 90.00%

Abstract:

Atlantic croaker Micropogonias undulatus is a commercially and ecologically important bottom-associated fish that occurs in marine and estuarine systems from Cape Cod, MA to Mexico. I documented the temporal and spatial variability in the diet of Atlantic croaker in Chesapeake Bay and found that in the summer, fish (particularly bay anchovies, Anchoa mitchilli) make up at least 20% of the diet of croaker by weight. The use of a pelagic food source seems unusual for a bottom-associated fish such as croaker, but appears to reflect a crepuscular feeding habit that had not been previously detected. Thus, I investigated the bioenergetic consequences of secondary piscivory for the distribution of croaker, for the condition of individuals within the population, and for the ecosystem. Generalized additive models revealed that the biomass of anchovy explained some of the variability in croaker occurrence and abundance in Chesapeake Bay. However, physical factors, specifically temperature, salinity, and seasonal dynamics, were stronger determinants of croaker distribution than potential prey availability. To better understand the bioenergetic consequences of diet variability at the individual level, I tested the hypothesis that croaker feeding on anchovies are in better condition than those feeding on polychaetes, using a variety of condition measures that operate on multiple time scales: RNA:DNA, Fulton's condition factor (K), relative weight (Wr), energy density, hepatosomatic index (HSI), and gonadosomatic index (GSI). Of these condition measures, several morphometric measures were significantly positively correlated with each other and with the percentage (by weight) of anchovy in croaker diets, suggesting that the type of prey eaten is important in improving the overall condition of individual croaker. 
To estimate the bioenergetic consequences of diet variability on growth and consumption in croaker, I developed and validated a bioenergetic model for Atlantic croaker in the laboratory. The application of this model suggested that croaker could be an important competitor with weakfish and striped bass for food resources during the spring and summer when population abundances of these three fishes are high in Chesapeake Bay. Even though anchovies made up a relatively small portion of croaker diet and only at certain times of the year, croaker consumed more anchovy at the population level than striped bass in all simulated years and nearly as much anchovy as weakfish. This indicates that weak trophic interactions between species are important in understanding ecosystem processes and should be considered in ecosystem-based management.
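Two of the condition measures named above are simple morphometric ratios with standard textbook definitions: Fulton's condition factor K = 100·W/L³ (weight in grams, length in centimeters) and relative weight Wr, the observed weight as a percentage of a length-specific standard weight. A minimal sketch using these standard formulas (the example fish is invented, not data from this dissertation):

```python
def fultons_k(weight_g: float, length_cm: float) -> float:
    """Fulton's condition factor: K = 100 * W / L^3 (W in g, L in cm)."""
    return 100.0 * weight_g / length_cm ** 3

def relative_weight(weight_g: float, standard_weight_g: float) -> float:
    """Relative weight Wr: observed weight as a percentage of the
    length-specific standard weight for the species."""
    return 100.0 * weight_g / standard_weight_g

# A hypothetical 30 cm croaker weighing 270 g:
k = fultons_k(270.0, 30.0)    # 100 * 270 / 27000 = 1.0
wr = relative_weight(270.0, 300.0)  # 90% of a (hypothetical) standard weight
```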

Relevance: 90.00%

Abstract:

This study, "Civil Rights on the Cell Block: Race, Reform, and Violence in Texas Prisons and the Nation, 1945-1990," offers a new perspective on the historical origins of the modern prison industrial complex, sexual violence in working-class culture, and the ways in which race shaped the prison experience. This study joins new scholarship that reperiodizes the Civil Rights era while also considering how violence and radicalism shaped the civil rights struggle. It places the criminal justice system at the heart of both an older racial order and a prison-made civil rights movement that confronted the prison's power to deny citizenship and enforce racial hierarchies. By charting the trajectory of the civil rights movement in Texas prisons, my dissertation demonstrates how the internal struggle over rehabilitation and punishment shaped civil rights, racial formation, and the political contest between liberalism and conservatism. This dissertation offers a close case study of Texas, where the state prison system emerged as a national model for penal management. The dissertation begins with a hopeful story of reform marked by an apparently successful effort by the State of Texas to replace its notorious 1940s plantation/prison farm system with an efficient, business-oriented agricultural enterprise system. When this new system was fully operational in the 1960s, Texas garnered plaudits as a pioneering, modern, efficient, and business-oriented Sun Belt state. But this reputation of competence and efficiency obfuscated the reality of a brutal system of internal prison management in which inmates acted as guards, employing coercive means to maintain control over the prisoner population. The inmates whom the prison system placed in charge also ran an internal prison economy in which money, food, human beings, reputations, favors, and sex all became commodities to be bought and sold. 
I analyze both how the Texas prison system managed to maintain its high external reputation for so long in the face of the internal reality and how that reputation collapsed when inmates, inspired by the Civil Rights Movement, revolted. My dissertation shows that this inmate Civil Rights rebellion was a success in forcing an end to the existing system but a failure in its attempts to make conditions in Texas prisons more humane. The new Texas prison regime, I conclude, utilized paramilitary practices, privatized prisons, and gang-related warfare to establish a new system that focused much more on law and order in the prisons than on the legal and human rights of prisoners. Placing the inmates and their struggle at the heart of the national debate over rights and "law and order" politics reveals an inter-racial social justice movement that asked the courts to reconsider how the state punished those who committed a crime while also reminding the public of the inmates' humanity and their constitutional rights.

Relevance: 90.00%

Abstract:

Gemstone Team Future Firefighting Advancements

Relevance: 90.00%

Abstract:

Gemstone Team GREEN JUSTICE

Relevance: 90.00%

Abstract:

Using scientific methods in the humanities is at the forefront of objective literary analysis. However, processing big data is particularly complex when the subject matter is qualitative rather than numerical. Large volumes of text require specialized tools to produce quantifiable data from ideas and sentiments. Our team researched the extent to which tools such as Weka and MALLET can test hypotheses about qualitative information. We examined the claim that literary commentary exists within political environments and used US periodical articles concerning Russian literature in the early twentieth century as a case study. These tools generated useful quantitative data that allowed us to run stepwise binary logistic regressions. These statistical tests allowed for time series experiments using sea change and emergency models of history, as well as classification experiments with regard to author characteristics, social issues, and sentiment expressed. Both types of experiments supported our claim to varying degrees but, more importantly, served as a definitive demonstration that digitally enhanced quantitative forms of analysis can apply to qualitative data. Our findings set the foundation for further experiments in the emerging field of digital humanities.
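The pipeline described above (raw text, then quantitative features, then a binary logistic classifier) can be sketched in a few lines. This is a toy stand-in, not the team's actual Weka/MALLET workflow: the corpus, labels, and vocabulary below are invented, and plain gradient descent replaces stepwise regression.

```python
import math
from collections import Counter

def featurize(docs, vocab):
    """Bag-of-words counts over a fixed vocabulary, plus an intercept term."""
    return [[1.0] + [float(Counter(d.split())[w]) for w in vocab] for d in docs]

def fit_logistic(X, y, lr=0.5, epochs=2000):
    """Plain stochastic-gradient-descent logistic regression (unregularized)."""
    w = [0.0] * len(X[0])
    for _ in range(epochs):
        for xi, yi in zip(X, y):
            p = 1.0 / (1.0 + math.exp(-sum(wj * xj for wj, xj in zip(w, xi))))
            w = [wj + lr * (yi - p) * xj for wj, xj in zip(w, xi)]
    return w

def predict(w, xi):
    """Classify as 1 when the modeled probability reaches 0.5."""
    return 1.0 / (1.0 + math.exp(-sum(wj * xj for wj, xj in zip(w, xi)))) >= 0.5

# Invented toy corpus: label 1 = politically framed commentary, 0 = aesthetic.
docs = ["revolution state politics", "style prose beauty",
        "politics censorship state", "lyrical style form"]
labels = [1, 0, 1, 0]
vocab = ["politics", "state", "style"]
w = fit_logistic(featurize(docs, vocab), labels)
```

On this separable toy data the fitted model classifies a new politics-heavy snippet as label 1 and a style-heavy one as label 0.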

Relevance: 90.00%

Abstract:

Technology-supported citizen science has created huge volumes of data with increasing potential to facilitate scientific progress; however, verifying data quality is still a substantial hurdle due to the limitations of existing data quality mechanisms. In this study, we adopted a mixed methods approach to investigate community-based data validation practices and the characteristics of records of wildlife species observations that affected the outcomes of collaborative data quality management in an online community where people record what they see in nature. The findings describe processes that both relied upon and added to information provenance through information stewardship behaviors, which led to improved reliability and informativity. The likelihood of community-based validation interactions was predicted by several factors, including the types of organisms observed and whether the data were submitted from a mobile device. We conclude with implications for technology design, citizen science practices, and research.

Relevance: 90.00%

Abstract:

E-books on their own are complex; they become even more so in the context of course reserves. In FY2016, the Resource Sharing & Reserves and Acquisitions units developed a new workflow for vetting requested e-books to ensure that they were suitable for course reserves (i.e., they permit unlimited simultaneous users) before posting links to them within the university's online learning management system. In the Spring 2016 semester, 46 e-books were vetted through this process, resulting in 18 purchases. Preliminary data analysis sheds light on the suitability of the Libraries' current e-book collections for course reserves as well as faculty preferences, with potential implications for the Libraries' ordering process. We hope this lightning talk will generate discussion about these issues among selectors, collection managers, and reserves staff alike.

Relevance: 90.00%

Abstract:

The ability to manipulate gene expression promises to be an important tool for the management of infectious diseases and genetic disorders. However, a major limitation to effective delivery of therapeutic RNA to living cells is the cellular toxicity of conventional techniques. Team PANACEA's research objective was to create new reagents based on a novel small-molecule delivery system that uses a modular recombinant protein vehicle consisting of a specific ligand coupled to a Hepatitis B Virus-derived RNA binding domain (HBV-RBD). Two such recombinant delivery proteins were developed: one composed of Interleukin-8, the other consisting of the Machupo Virus GP1 protein. The ability of these proteins to deliver RNA to cells was then tested. The non-toxic nature of this technology has the potential to overcome limitations of current methods and could provide a platform for the expansion of personalized medicine.

Relevance: 90.00%

Abstract:

A synopsis and preparation presentation for a Society of American Archivists pre-conference workshop (held on August 17, 2015).

Relevance: 90.00%

Abstract:

Aged steel slag, an abundant by-product of the steel-making industry, has great potential for use as an aggregate in road construction. However, the high pH of steel slag seepage (pH≥12) is a major impediment to its beneficial use. Analyses of aged steel slag samples demonstrated that their alkalinity-producing capacity correlated strongly with Ca(OH)2 dissolution and that prolonged aging periods have marginal effects on overall alkalinity. Treatment methods that included bitumen coating, bathing in Al(III) solutions, and addition of an alum-based drinking water treatment residual (WTR) were evaluated based on reduction in pH levels and leachate alkalinity. Addition of 10% (wt./wt.) WTR to slag was determined to be the most successful mitigation method, providing 65−70% reduction in alkalinity in both batch-type and column leach tests, but final leachate pH was only 0.5−1 units lower and leachates were contaminated by dissolved Al(III) (≥3−4 mM). Based on the interpretation of calculated saturation indices and SEM and EDX analyses, formation of calcium sulfoaluminate phases (i.e., ettringite and monosulfate) was suggested as the mechanism behind alkalinity mitigation upon WTR modification. The residual alkalinity in WTR-amended slag leachates could be completely eliminated using a biosolids compost with high base neutralization capacity. In column leach tests, effluent pH levels below 7 were maintained for 58−74 pore volumes of WTR-amended slag leachate using 0.13 kg compost (dry wt.) per 1 kg WTR-amended slag on average; dissolved Al(III) was also strongly retained on the compost.

Relevance: 90.00%

Abstract:

Deployment of low power basestations within cellular networks can potentially increase both capacity and coverage. However, such deployments require efficient resource allocation schemes for managing interference from the low power and macro basestations that are located within each other's transmission range. In this dissertation, we propose novel and efficient dynamic resource allocation algorithms in the frequency, time, and space domains. We show that the proposed algorithms perform better than current state-of-the-art resource management algorithms. In the first part of the dissertation, we propose an interference management solution in the frequency domain. We introduce a distributed frequency allocation scheme that shares frequencies between macro and low power pico basestations, and guarantees a minimum average throughput to users. The scheme seeks to minimize the total number of frequencies needed to honor the minimum throughput requirements. We evaluate our scheme using detailed simulations and show that it performs on par with the centralized optimum allocation. Moreover, our proposed scheme outperforms a static frequency reuse scheme and the centralized optimal partitioning between the macro and picos. In the second part of the dissertation, we propose a time domain solution to the interference problem. We consider the problem of maximizing the alpha-fairness utility over heterogeneous wireless networks (HetNets) by jointly optimizing user association, wherein each user is associated with any one transmission point (TP) in the network, and the activation fractions of all TPs. The activation fraction of a TP is the fraction of the frame duration for which it is active, and together these fractions determine the interference seen in the network. To address this joint optimization problem, which we show is NP-hard, we propose an alternating optimization based approach wherein the activation fractions and the user association are optimized in an alternating manner. 
The subproblem of determining the optimal activation fractions is solved using a provably convergent auxiliary function method. The subproblem of determining the user association, in turn, is solved via a simple combinatorial algorithm. Meaningful performance guarantees are derived in each case. Simulation results over a practical HetNet topology reveal the superior performance of the proposed algorithms and underscore the significant benefits of the joint optimization. In the final part of the dissertation, we propose a space domain solution to the interference problem. We consider the problem of maximizing system utility by optimizing over the set of user and TP pairs in each subframe, where each user can be served by multiple TPs. To address this optimization problem, which is NP-hard, we propose a solution scheme based on a difference-of-submodular-functions optimization approach. We evaluate our scheme using detailed simulations and show that it performs on par with a much more computationally demanding difference-of-convex-functions optimization scheme. Moreover, the proposed scheme performs within a reasonable percentage of the optimal solution. We further demonstrate the advantage of the proposed scheme by studying its performance with variation in different network topology parameters.
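The alpha-fairness utility mentioned above is the standard parametric family U_α(x) = x^(1−α)/(1−α) for α ≠ 1, with U_1(x) = log x: α = 0 recovers sum-rate, α = 1 proportional fairness, and α → ∞ approaches max-min fairness. A short sketch of the objective (the rate vectors are illustrative, not from the dissertation's simulations):

```python
import math

def alpha_fair_utility(rate: float, alpha: float) -> float:
    """Standard alpha-fairness utility of a user rate x > 0:
    U(x) = log(x) if alpha == 1, else x**(1 - alpha) / (1 - alpha)."""
    if alpha == 1.0:
        return math.log(rate)
    return rate ** (1.0 - alpha) / (1.0 - alpha)

def network_utility(rates, alpha):
    """Network objective: sum of per-user utilities."""
    return sum(alpha_fair_utility(r, alpha) for r in rates)

# Two allocations of the same total capacity (10 units, hypothetical):
even, lopsided = [5.0, 5.0], [9.0, 1.0]
# Sum-rate (alpha = 0) is indifferent between them; proportional
# fairness (alpha = 1) prefers the even split.
```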

Relevance: 90.00%

Abstract:

Contemporary integrated circuits are designed and manufactured in a globalized environment, leading to concerns of piracy, overproduction, and counterfeiting. One class of techniques to combat these threats is circuit obfuscation, which seeks to modify the gate-level (or structural) description of a circuit without affecting its functionality in order to increase the complexity and cost of reverse engineering. Most existing circuit obfuscation methods are based on the insertion of additional logic (called "key gates") or camouflaging of existing gates in order to make it difficult for a malicious user to get the complete layout information without extensive computations to determine key-gate values. However, when the netlist or the circuit layout, although camouflaged, is available to the attacker, he/she can use advanced logic analysis and circuit simulation tools and Boolean SAT solvers to reveal the unknown gate-level information without exhaustively trying all the input vectors, thus bringing down the complexity of reverse engineering. To counter this problem, some "provably secure" logic encryption algorithms that emphasize methodical selection of camouflaged gates have been proposed previously in the literature [1,2,3]. The contribution of this paper is the creation and simulation of a new layout obfuscation method that uses don't care conditions. We also present a proof-of-concept of a new functional or logic obfuscation technique that not only conceals but modifies the circuit functionality in addition to the gate-level description, and can be implemented automatically during the design process. Our layout obfuscation technique utilizes don't care conditions (namely, observability and satisfiability don't cares) inherent in the circuit to camouflage selected gates and modify sub-circuit functionality while meeting the overall circuit specification. 
Here, camouflaging or obfuscating a gate means replacing the candidate gate with a 4×1 multiplexer that can be configured to perform any of the possible 2-input/1-output functions, as proposed by Bao et al. [4]. It is important to emphasize that our approach not only obfuscates but alters sub-circuit-level functionality in an attempt to make IP piracy difficult. The choice of gates to obfuscate determines the effort required to reverse engineer or brute force the design. As such, we propose a method of camouflaged gate selection based on the intersection of output logic cones. By choosing these candidate gates methodically, the complexity of reverse engineering can be made exponential, thus making it computationally very expensive to determine the true circuit functionality. We propose several heuristic algorithms to maximize the reverse-engineering complexity of don't-care-based obfuscation with methodical gate selection. Thus, the goal of protecting the design IP from malicious end-users is achieved. It also makes it significantly harder for rogue elements in the supply chain to use, copy, or replicate the same design with different logic. We analyze the reverse-engineering complexity by applying our obfuscation algorithm to the ISCAS-85 benchmarks. Our experimental results indicate that significant reverse-engineering complexity can be achieved at minimal design overhead (average area overhead for the proposed layout obfuscation methods is 5.51% and average delay overhead is about 7.732%). We discuss the strengths and limitations of our approach and suggest directions that may lead to improved logic encryption algorithms in the future. References: [1] R. Chakraborty and S. Bhunia, “HARPOON: An Obfuscation-Based SoC Design Methodology for Hardware Protection,” IEEE Transactions on Computer-Aided Design of Integrated Circuits and Systems, vol. 28, no. 10, pp. 1493–1502, 2009. [2] J. A. Roy, F. Koushanfar, and I. L. Markov, “EPIC: Ending Piracy of Integrated Circuits,” in Design, Automation and Test in Europe (DATE), 2008, pp. 1069–1074. [3] J. Rajendran, M. Sam, O. Sinanoglu, and R. Karri, “Security Analysis of Integrated Circuit Camouflaging,” in ACM Conference on Computer and Communications Security (CCS), 2013. [4] B. Liu and B. Wang, “Embedded Reconfigurable Logic for ASIC Design Obfuscation against Supply Chain Attacks,” in Design, Automation and Test in Europe Conference and Exhibition (DATE), 2014, pp. 1–6.
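The multiplexer-based camouflage described above can be illustrated in a few lines: the gate's two original inputs drive the select lines, and four hidden configuration bits hold the truth table, so a single cell can realize any of the 16 two-input Boolean functions. A toy behavioral model (not the authors' implementation; the named configurations are just examples):

```python
def mux4(config, a, b):
    """4-to-1 multiplexer acting as a camouflaged 2-input gate.
    `config` is a 4-bit truth table (outputs for ab = 00, 01, 10, 11);
    the original gate inputs a and b drive the select lines."""
    return config[(a << 1) | b]

# Example configurations: the same cell realizes AND, OR, XOR, ...
AND, OR, XOR = (0, 0, 0, 1), (0, 1, 1, 1), (0, 1, 1, 0)

# Without the hidden config bits, an attacker faces 16 candidate
# functions per camouflaged gate, so k independent camouflaged gates
# can present up to 16**k functional hypotheses to resolve.
assert mux4(AND, 1, 1) == 1 and mux4(AND, 1, 0) == 0
assert mux4(OR, 0, 1) == 1 and mux4(OR, 0, 0) == 0
assert mux4(XOR, 1, 0) == 1 and mux4(XOR, 1, 1) == 0
```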

Relevance: 90.00%

Abstract:

Traffic demand increases are pushing aging ground transportation infrastructures to their theoretical capacity. The result of this demand is traffic bottlenecks that are a major cause of delay on urban freeways. In addition, the queues associated with those bottlenecks increase the probability of a crash while adversely affecting environmental measures such as emissions and fuel consumption. With limited resources available for network expansion, traffic professionals have developed active traffic management systems (ATMS) in an attempt to mitigate the negative consequences of traffic bottlenecks. Among these ATMS strategies, variable speed limits (VSL) and ramp metering (RM) have been gaining international interest for their potential to improve safety, mobility, and environmental measures at freeway bottlenecks. Though previous studies have shown the tremendous potential of VSL control and of VSL paired with ramp metering (VSLRM) control, little guidance has been developed to assist decision makers in the planning phase of a congestion mitigation project that is considering VSL or VSLRM control. To address this need, this study has developed a comprehensive decision/deployment support tool for the application of VSL and VSLRM control in recurrently congested environments. The decision tool will assist practitioners in deciding which control strategy is most appropriate at a candidate site, which candidate sites have the most potential to benefit from the suggested control strategy, and how to most effectively design the field deployment of the suggested control strategy at each implementation site. To do so, the tool comprises three key modules: (1) a Decision Module, (2) a Benefits Module, and (3) a Deployment Guidelines Module. Each module uses commonly known traffic flow and geometric parameters as inputs to statistical models and empirically based procedures to provide guidance on the application of VSL and VSLRM at each candidate site. 
These models and procedures were developed from the outputs of simulated experiments, calibrated with field data. To demonstrate the application of the tool, a list of real-world candidate sites was selected from the Maryland State Highway Administration Mobility Report. Field data from each candidate site were input into the tool to illustrate the step-by-step process required for efficient planning of VSL or VSLRM control. The output of the tool includes the suggested control system at each site, a ranking of the sites based on the expected benefit-to-cost ratio, and guidelines on how to deploy the VSL signs, ramp meters, and detectors at the deployment site(s). This research has the potential to assist traffic engineers in the planning of VSL and VSLRM control, thus enhancing the procedure for allocating limited resources for mobility and safety improvements on highways plagued by recurrent congestion.
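The ranking step described above can be sketched minimally: score each candidate site by expected benefit over cost and sort descending. The site names and numbers below are hypothetical, not taken from the Mobility Report:

```python
def rank_sites(sites):
    """Rank candidate sites by expected benefit-to-cost ratio, descending.
    `sites` maps a site name to a (benefit, cost) pair in common units."""
    return sorted(sites, key=lambda s: sites[s][0] / sites[s][1], reverse=True)

# Hypothetical candidates: (expected benefit, deployment cost).
candidates = {
    "I-95 NB": (12.0, 4.0),   # ratio 3.0
    "US-50 WB": (9.0, 2.0),   # ratio 4.5
    "I-495": (6.0, 3.0),      # ratio 2.0
}
ranking = rank_sites(candidates)
```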

Relevance: 90.00%

Abstract:

In today's fast-paced and interconnected digital world, the data generated by an increasing number of applications is being modeled as dynamic graphs. The graph structure encodes relationships among data items, while the structural changes to the graphs as well as the continuous stream of information produced by the entities in these graphs make them dynamic in nature. Examples include social networks where users post status updates, images, videos, etc.; phone call networks where nodes may send text messages or place phone calls; road traffic networks where the traffic behavior of the road segments changes constantly; and so on. There is tremendous value in storing, managing, and analyzing such dynamic graphs and deriving meaningful insights in real-time. However, a majority of the work in graph analytics assumes a static setting, and there is a lack of systematic study of the various dynamic scenarios, the complexity they impose on the analysis tasks, and the challenges in building efficient systems that can support such tasks at a large scale. In this dissertation, I design a unified streaming graph data management framework, and develop prototype systems to support increasingly complex tasks on dynamic graphs. In the first part, I focus on the management and querying of distributed graph data. I develop a hybrid replication policy that monitors the read-write frequencies of the nodes to decide dynamically what data to replicate, and whether to do eager or lazy replication, in order to minimize network communication and support low-latency querying. In the second part, I study parallel execution of continuous neighborhood-driven aggregates, where each node aggregates the information generated in its neighborhoods. 
I build my system around the notion of an aggregation overlay graph, a pre-compiled data structure that enables sharing of partial aggregates across different queries, and also allows partial pre-computation of the aggregates to minimize the query latencies and increase throughput. Finally, I extend the framework to support continuous detection and analysis of activity-based subgraphs, where subgraphs could be specified using both graph structure as well as activity conditions on the nodes. The query specification tasks in my system are expressed using a set of active structural primitives, which allows the query evaluator to use a set of novel optimization techniques, thereby achieving high throughput. Overall, in this dissertation, I define and investigate a set of novel tasks on dynamic graphs, design scalable optimization techniques, build prototype systems, and show the effectiveness of the proposed techniques through extensive evaluation using large-scale real and synthetic datasets.
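A frequency-driven replication decision of the kind described in the first part can be sketched as a simple rule: replicate hot, read-dominated nodes eagerly, moderately read nodes lazily, and leave write-heavy or cold nodes unreplicated. The tiers and thresholds below are hypothetical illustrations, not the dissertation's actual policy:

```python
def replication_policy(reads: int, writes: int,
                       eager_threshold: float = 4.0,
                       min_reads: int = 10) -> str:
    """Toy hybrid replication decision for one graph node.
    Eager replication pushes every write to replicas (cheap reads,
    costly writes); lazy replication defers propagation; no replication
    avoids write amplification for cold or write-heavy nodes."""
    if reads < min_reads:
        return "none"          # too cold to justify replica upkeep
    ratio = reads / max(writes, 1)
    if ratio >= eager_threshold:
        return "eager"         # read-hot: pay the write cost up front
    if ratio >= 1.0:
        return "lazy"          # mixed: propagate updates on demand
    return "none"              # write-dominated: replicas cost too much

assert replication_policy(100, 5) == "eager"
assert replication_policy(30, 20) == "lazy"
assert replication_policy(5, 50) == "none"
```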

Relevance: 90.00%

Abstract:

Asian elephants (Elephas maximus) are critically endangered and live in fragmented populations spread across 13 countries. Yet in comparison to the African savannah elephant (Loxodonta africana), relatively little is known about the social structure of wild Asian elephants because the species is mostly found in low visibility habitat. A better understanding of Asian elephant social structure is critical to mitigate human-elephant conflicts that arise due to increasing human encroachments into elephant habitats. In this dissertation, I examined the social structure of Asian elephants at three sites: Yala, Udawalawe, and Minneriya National Parks in Sri Lanka, where the presence of large open areas and high elephant densities are conducive to behavioral observations. First, I found that the size of groups observed at georeferenced locations was affected by forage availability and distance to water, and the effects of these environmental factors on group size depended on site. Second, I discovered that while populations at different sites differed in the prevalence of weak associations among individuals, a core social structure of individuals sharing strong bonds and organized into highly independent clusters was present across sites. Finally, I showed that the core social structure preserved across sites was typically composed of adult females associating with each other and with other age-sex classes. In addition, I showed that females are social at all life stages, whereas males gradually transition from living in a group to a more solitary lifestyle. Taking into consideration these elements of Asian elephant social structure will help conservation biologists develop effective management strategies that account for both human needs and the socio-ecology of the elephants.