937 results for Threshold cryptographic schemes and algorithms
Abstract:
The aim of this investigation is to analyze the use of the blog as an educational resource for the development of mathematical communication in secondary education. With this aim, four aspects are analyzed: organization of mathematical thinking through communication; communication of mathematical thinking; analysis and evaluation of the strategies and mathematical thought of others; and expression of mathematical ideas using mathematical language. The research was conducted from a qualitative approach at an exploratory level, as a case study of four second-grade secondary classrooms in a private school in Lima. Observation was applied to 20 posts on the mathematics class blog; a focus group was held with a sample of 9 students of different levels of academic performance; and an interview with the school's academic coordinator was conducted. The results show that the organization of mathematical thinking through communication is carried out on the blog in written, graphical and oral form through explanations, diagrams and videos. Regarding communication of mathematical thinking, the blog is used to describe concepts, arguments and mathematical procedures in the students' own words and examples. The analysis and evaluation of the strategies and mathematical thinking of others takes place through comments and debates about the posts. It was also noted that the blog does not facilitate the use of mathematical language to express mathematical ideas, since it allows neither direct entry of symbols nor graphic representation.
Abstract:
The spectroscopy and metastability of the carbon dioxide doubly charged ion, the CO₂²⁺ dication, have been studied with photoionization experiments: time-of-flight photoelectron-photoelectron coincidence (TOF-PEPECO), threshold photoelectrons coincidence (TPEsCO), and threshold photoelectrons and ion coincidence (TPEsCO ion coincidence) spectroscopies. Vibrational structure is observed in TOF-PEPECO and TPEsCO spectra of the ground and first two excited states. The vibrational structure is dominated by the symmetric stretch, except in the TPEsCO spectrum of the ground state where an antisymmetric stretch progression is observed. All three vibrational frequencies are deduced for the ground state, and symmetric stretch and bending frequencies are deduced for the first two excited states. Some vibrational structure of higher electronic states is also observed. The threshold for double ionization of carbon dioxide is reported as 37.340±0.010 eV. The fragmentation of energy-selected CO₂²⁺ ions has been investigated with TPEsCO ion coincidence spectroscopy. A band of metastable states from ∼38.7 to ∼41 eV above the ground state of neutral CO₂ has been observed in the experimental time window of ∼0.1-2.3 μs, with a tendency towards shorter lifetimes at higher energies. It is proposed that the metastability is due to slow spin-forbidden conversion from bound excited singlet states to unbound continuum states of the triplet ground state. Another result of this investigation is the observation of CO⁺ + O⁺ formation in indirect dissociative double photoionization below the threshold for formation of CO₂²⁺. The threshold for CO⁺ + O⁺ formation is found to be 35.56±0.10 eV or lower, which is more than 2 eV lower than previous measurements.
Abstract:
While the connection between seriality and comics in the twentieth century has frequently been a subject of study, far less attention has been paid to the role of serialisation in the previous century, when the language of comics gradually developed in the illustrated and satirical press. This article discusses a heterogeneous group of graphic narratives published in various European countries between the 1830s and the 1880s, before a new generation of comic magazines influenced by American newspaper strips transformed this emerging field into the autonomous medium of comics. Serial works flourished during this period and included diverse modes, such as series of “graphic novels,” the use of recurring characters, the serialisation of picture stories in humour periodicals, and the use of graphic narratives as a regular feature in the illustrated news magazines. By providing a panoramic survey of various types of serial texts, the article suggests that the hybrid nature of these graphic narratives and the publishing strategies applied to them can be better understood if considered in relation to the larger context of nineteenth-century print culture, rather than in comparison with the future of the medium.
Abstract:
This thesis analyses the influence of qualitative and quantitative herbage production on seasonal rangelands, and of herd and pasture use strategies, on feed intake, body mass development and reproductive performance of sheep and goats in the Altai mountain region of Bulgan county (soum) in Khovd province (aimag). This westernmost county of Mongolia is characterized by a very poor road network and thus very difficult access to regional and national markets. The thesis explores, in this localized context, the current rural development, the economic settings and the political measures that affect the traditional extensive livestock husbandry system and its importance for rural livelihoods. Livestock management practices still follow the traditional transhumant mode, relying fully on natural pasture. This renders animal feeding very vulnerable to the highly variable climatic conditions, which is one of many reasons for the gradually declining quantity and quality of pasture vegetation. Small ruminants, and especially goats, are the most important species securing the economic viability of their owners' livelihoods, and they are well adapted to the harsh continental climate and the present low-input management practices. It is likely that small ruminants will keep their vital role for the rural community in the future, since the weak local infrastructure and slow market developments currently do not allow many income diversification options. Since the profitability of a single animal is low, animal numbers tend to increase, whereas herd management does not change. Possibilities to improve the current livestock management, and thus herders' livelihoods, in an environmentally, economically and socially sustainable manner are simulated through bio-economic modelling, and the implications are discussed at the regional and national scale. To increase the welfare of the local population, substantial infrastructural and market development is needed, accompanied by suitable pasture management schemes and policies.
Abstract:
Hybrid simulation is a technique that combines experimental and numerical testing and has been used for the last decades in the fields of aerospace, civil and mechanical engineering. During this time, most of the research has focused on developing algorithms and the necessary technology, including but not limited to error minimisation techniques, phase-lag compensation and faster hydraulic cylinders. However, one of the main shortcomings of hybrid simulation that has prevented its widespread use is the size of the numerical models and the effect that higher frequencies may have on the stability and accuracy of the simulation. The first chapter of this document provides an overview of the hybrid simulation method and of the hybrid simulation schemes, and the corresponding time integration algorithms, that are most commonly used in this field. The scope of this thesis is presented in more detail in chapter 2: a substructure algorithm, the Substep Force Feedback (Subfeed), is adapted in order to fulfil the necessary requirements in terms of speed. The effects of more complex models on the Subfeed are also studied in detail, and the improvements made are validated experimentally. Chapters 3 and 4 detail the methodologies used to accomplish these objectives, listing the different case studies and detailing the hardware and software used to validate them experimentally. The third chapter contains a brief introduction to a project, the DFG Subshake, whose data have been used as a starting point for the developments shown later in this thesis. The results obtained are presented in chapters 5 and 6, the first focusing on purely numerical simulations while the second is oriented towards practical application, including experimental real-time hybrid simulation tests with large numerical models. Following the discussion of the developments in this thesis is a list of hardware and software requirements that must be met in order to apply the methods described in this document; they can be found in chapter 7. The last chapter, chapter 8, focuses on the conclusions and achievements extracted from the results, namely: the adaptation of the hybrid simulation algorithm Subfeed for use with large numerical models, the study of the effect of high frequencies on the substructure algorithm, and experimental real-time hybrid simulation tests with vibrating subsystems using large numerical models and shake tables. A brief discussion of possible future research activities can be found in the concluding chapter.
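As an illustration of the kind of stepping loop that real-time hybrid simulation relies on, the following Python sketch couples a numerical single-degree-of-freedom substructure, advanced with the explicit central-difference scheme, to an emulated "physical" substructure whose restoring force is fed back at every step. All parameters are invented, and this is a generic hybrid simulation scheme, not the Subfeed algorithm described in the thesis.

```python
import numpy as np

# Hypothetical real-time hybrid simulation loop (generic scheme, NOT Subfeed):
# a numerical SDOF substructure is advanced with the explicit central-difference
# method, while the physical substructure is emulated by a linear spring whose
# restoring force is fed back each step.

m, c, k = 1000.0, 200.0, 5.0e5        # numerical substructure (values invented)
k_phys = 2.0e5                        # stiffness of the emulated physical part
dt, n_steps = 1.0e-3, 5000            # dt satisfies the explicit stability limit

u = np.zeros(n_steps)                 # displacement history, at-rest initial state
f_ext = 1.0e3 * np.sin(2 * np.pi * 5.0 * dt * np.arange(n_steps))  # 5 Hz excitation

for i in range(1, n_steps - 1):
    f_meas = k_phys * u[i]            # "measured" restoring force fed back
    # central difference: m*(u[i+1]-2u[i]+u[i-1])/dt^2
    #                     + c*(u[i+1]-u[i-1])/(2dt) + k*u[i] = f_ext[i] - f_meas
    rhs = (f_ext[i] - f_meas - k * u[i] + (2 * m / dt**2) * u[i]
           - (m / dt**2 - c / (2 * dt)) * u[i - 1])
    u[i + 1] = rhs / (m / dt**2 + c / (2 * dt))

print("peak displacement: %.3e m" % np.max(np.abs(u)))
```

In an actual test the spring force would be replaced by the load-cell signal measured on the physical specimen, and each pass through the loop would have to complete within the controller's sampling period, which is precisely where large numerical models become problematic.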
Abstract:
Thesis (Ph.D.)--University of Washington, 2016-08
Abstract:
A primary goal of context-aware systems is delivering the right information at the right place and right time to users, in order to enable them to make effective decisions and improve their quality of life. There are three key requirements for achieving this goal: determining what information is relevant, personalizing it based on the users' context (location, preferences, behavioral history, etc.), and delivering it to them in a timely manner without an explicit request. These requirements create a paradigm that we term "Proactive Context-aware Computing". Most existing context-aware systems fulfill only a subset of these requirements. Many of them focus only on personalization of the requested information based on users' current context. Moreover, they are often designed for specific domains. In addition, most existing systems are reactive: users request some information and the system delivers it. These systems are not proactive, i.e., they cannot anticipate users' intent and behavior and act proactively without an explicit request. In order to overcome these limitations, we need to conduct a deeper analysis and enhance our understanding of context-aware systems that are generic, universal, proactive and applicable to a wide variety of domains. To support this dissertation, we explore several directions. Clearly, the most significant sources of information about users today are smartphones. A large amount of users' context can be acquired through them, and they can be used as an effective means to deliver information to users. In addition, social media such as Facebook, Flickr and Foursquare provide a rich and powerful platform to mine users' interests, preferences and behavioral history. We employ the ubiquity of smartphones and the wealth of information available from social media to address the challenge of building proactive context-aware systems. We have implemented and evaluated several approaches, including some as part of the Rover framework, to achieve the paradigm of Proactive Context-aware Computing. Rover is a context-aware research platform which has been evolving for the last 6 years. Since location is one of the most important elements of users' context, we have developed 'Locus', an indoor localization, tracking and navigation system for multi-story buildings. Other important dimensions of users' context include the activities they are engaged in. To this end, we have developed 'SenseMe', a system that leverages the smartphone and its multiple sensors to perform multidimensional context and activity recognition for users. As part of the 'SenseMe' project, we also conducted an exploratory study of privacy, trust, risks and other concerns of users with smartphone-based personal sensing systems and applications. To determine what information would be relevant to users' situations, we have developed 'TellMe', a system that employs a new, flexible and scalable approach based on Natural Language Processing techniques to perform bootstrapped discovery and ranking of relevant information in context-aware systems. In order to personalize the relevant information, we have also developed an algorithm and system for mining a broad range of users' preferences from their social network profiles and activities.
For recommending new information to users based on their past behavior and context history (such as visited locations, activities and time), we have developed a recommender system and an approach for performing multi-dimensional collaborative recommendations using tensor factorization. For timely delivery of personalized and relevant information, it is essential to anticipate and predict users' behavior. To this end, we have developed a unified infrastructure within the Rover framework and implemented several novel approaches and algorithms that employ various contextual features and state-of-the-art machine learning techniques for building diverse behavioral models of users. Examples of generated models include classifying users' semantic places and mobility states, predicting their availability for accepting calls on smartphones, and inferring their device charging behavior. Finally, to enable proactivity in context-aware systems, we have also developed a planning framework based on Hierarchical Task Network (HTN) planning. Together, these works provide a major push in the direction of proactive context-aware computing.
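As a rough illustration of multi-dimensional collaborative recommendation via tensor factorization, the following Python sketch fits a rank-R CP (CANDECOMP/PARAFAC) model to a toy (user × location × time-slot) preference tensor by stochastic gradient descent. The data, dimensions and hyperparameters are invented for illustration; the dissertation's actual model and features are not specified here.

```python
import numpy as np

# Toy CP tensor factorization for (user, location, time-slot) recommendations.
# Each observed rating r_ult is modelled as sum_k U[u,k] * L[l,k] * T[t,k].

rng = np.random.default_rng(0)
n_users, n_locs, n_times, rank = 50, 30, 24, 5

# synthetic observations: (user, location, time-slot, rating) tuples
obs = [(rng.integers(n_users), rng.integers(n_locs),
        rng.integers(n_times), rng.uniform(1, 5)) for _ in range(2000)]

U = rng.normal(0, 0.1, (n_users, rank))   # latent user factors
L = rng.normal(0, 0.1, (n_locs, rank))    # latent location factors
T = rng.normal(0, 0.1, (n_times, rank))   # latent time-slot factors

lr, reg = 0.01, 0.02
for epoch in range(20):
    for u, l, t, r in obs:
        err = r - np.dot(U[u] * L[l], T[t])              # prediction error
        U[u] += lr * (err * L[l] * T[t] - reg * U[u])    # regularized SGD updates
        L[l] += lr * (err * U[u] * T[t] - reg * L[l])
        T[t] += lr * (err * U[u] * L[l] - reg * T[t])

# recommend: predicted score of user 0 for every location at time slot 9
scores = (U[0] * T[9]) @ L.T
print("top locations for user 0 at slot 9:", np.argsort(scores)[::-1][:5])
```

The appeal of the tensor form is that context (here the time slot) is a first-class dimension rather than a post-hoc filter, so the learned factors capture, e.g., places that are only relevant at certain times of day.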
Abstract:
In the last two decades, experimental progress in controlling cold atoms and ions has allowed us to manipulate fragile quantum systems with an unprecedented degree of precision. This has been made possible by the ability to isolate small ensembles of atoms and ions from noisy environments, creating truly closed quantum systems which decouple from dissipative channels. In recent years, however, several proposals have considered the possibility of harnessing dissipation in open systems, not only to cool degenerate gases to currently unattainable temperatures, but also to engineer a variety of interesting many-body states. This thesis describes progress made towards building a degenerate gas apparatus that will soon be capable of realizing these proposals. An ultracold gas of ytterbium atoms, trapped in a species-selective lattice, will be immersed into a Bose-Einstein condensate (BEC) of rubidium atoms which will act as a bath. Here we describe the challenges encountered in making a degenerate mixture of rubidium and ytterbium atoms and present two experiments performed on the path to creating a controllable open quantum system. The first experiment describes the measurement of a tune-out wavelength where the light shift of $^{87}$Rb vanishes. This wavelength was used to create a species-selective trap for ytterbium atoms. Furthermore, the measurement of this wavelength allowed us to extract the dipole matrix element of the $5s \rightarrow 6p$ transition in $^{87}$Rb with an extraordinary degree of precision. Our method to extract matrix elements has found use in atomic clocks, where precise knowledge of transition strengths is necessary to account for minute blackbody radiation shifts. The second experiment presents the first realization of a degenerate Bose-Fermi mixture of rubidium and ytterbium atoms. Using a three-color optical dipole trap (ODT), we were able to create a highly tunable, species-selective potential for rubidium and ytterbium atoms, which allowed us to use $^{87}$Rb to sympathetically cool $^{171}$Yb to degeneracy with minimal loss. This mixture is the first milestone towards creating the lattice-bath system and will soon be used to implement novel cooling schemes and explore the rich physics of dissipation.
Abstract:
Hydroxyl radical (OH) is the primary oxidant in the troposphere, initiating the removal of numerous atmospheric species, including greenhouse gases, pollutants that are detrimental to human health, and ozone-depleting substances. Because of the complexity of OH chemistry, models vary widely in their OH chemistry schemes and in the resulting methane (CH4) lifetimes. The current state of knowledge concerning global OH abundances is often contradictory. This body of work encompasses three projects that investigate tropospheric OH from a modeling perspective, with the goal of improving the atmospheric chemistry community's knowledge of the atmospheric lifetime of CH4. First, measurements taken during the airborne CONvective TRansport of Active Species in the Tropics (CONTRAST) field campaign are used to evaluate OH in global models. A box model constrained to measured variables is utilized to infer concentrations of OH along the flight track. Results are used to evaluate global model performance, argue against the existence of a proposed "OH Hole" in the tropical Western Pacific, and investigate the implications of high-O3/low-H2O filaments for chemical transport to the stratosphere. While methyl chloroform-based estimates of global mean OH suggest that models are overestimating OH, we report evidence that these models are actually underestimating OH in the tropical Western Pacific. The second project examines OH within global models to diagnose differences in CH4 lifetime. I developed an approach to quantify the roles of differences in OH precursor fields (O3, H2O, CO, NOx, etc.) using a neural network method. This technique enables us to approximate the change in CH4 lifetime resulting from variations in individual precursor fields. The dominant factors driving CH4 lifetime differences between models are O3, CO, and the photolysis rate J(O3 → O1D). My third project evaluates the effect of climate change on global fields of OH using an empirical model. Observations of H2O and O3 from satellite instruments are combined with a simulation of tropical expansion to derive changes in global mean OH over the past 25 years. We find that increasing H2O and the increasing width of the tropics tend to increase global mean OH, countering the increasing CH4 sink and resulting in well-buffered global tropospheric OH concentrations.
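The precursor-attribution idea in the second project can be sketched as follows: train a neural network emulator that maps precursor fields to OH, then swap a single precursor field between two models and see how the emulated OH responds. The Python example below uses synthetic data and a stand-in "chemistry" function; it illustrates the general approach only, not the study's actual models, fields or network architecture.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

# Illustrative precursor-swap attribution with a neural network emulator
# (synthetic data; invented stand-in for the real chemistry).

rng = np.random.default_rng(1)
n = 5000
X_a = rng.uniform(size=(n, 4))                # columns: O3, CO, H2O, J in "model A"
X_b = X_a + rng.normal(0, 0.1, size=(n, 4))   # "model B" fields, slightly different

def fake_oh(X):
    # stand-in chemistry: OH rises with O3, H2O and photolysis, falls with CO
    return X[:, 0] + X[:, 2] + X[:, 3] - 0.5 * X[:, 1]

# fit an emulator of model A's OH response to its precursor fields
nn = MLPRegressor(hidden_layer_sizes=(32, 32), max_iter=500, random_state=0)
nn.fit(X_a, fake_oh(X_a))

X_swap = X_a.copy()
X_swap[:, 1] = X_b[:, 1]                      # replace only the CO field with model B's
d_oh = nn.predict(X_swap).mean() - nn.predict(X_a).mean()
print("mean OH change attributable to the CO difference: %.4f" % d_oh)
```

The resulting change in emulated OH can then be translated into a change in CH4 lifetime, since the CH4 sink scales with the OH it reacts with; repeating the swap for each precursor field apportions the inter-model lifetime difference among them.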
Abstract:
Scalable Vector Graphics (SVG) has an imaging model similar to that of PostScript and PDF, but the XML basis of SVG allows it to participate fully, via namespaces, in generalised XML documents. There is increasing interest in using SVG as a Page Description Language, and we examine ways in which SVG document components can be encapsulated in contexts where SVG will be used as a rendering technology for conventional page printing. Our aim is to encapsulate portions of SVG content (SVG COGs) so that the COGs are mutually independent and can be moved around a page, while maintaining invariant graphic properties and with guaranteed freedom from side effects and mutual interference. Parallels are drawn between COG implementation within SVG's tree-based inheritance mechanisms and an earlier COG implementation using PDF.
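The encapsulation idea can be sketched in a few lines: wrap each content object in a group element that carries its own placement transform and explicitly pins the inheritable graphic attributes, so the fragment neither depends on nor perturbs surrounding state. The Python snippet below is a hypothetical illustration of that idea, not the paper's actual COG mechanism.

```python
# Hypothetical sketch of COG-style encapsulation: each fragment is wrapped in
# a <g> with an id, an explicit placement transform, and explicitly pinned
# inheritable attributes, so moving it only changes the outer transform.

COG_TEMPLATE = (
    '<g id="{cog_id}" transform="translate({x},{y})" '
    'fill="black" stroke="none" font-size="12">{body}</g>'
)

def place_cog(cog_id, body, x, y):
    """Return a self-contained, relocatable SVG group for one content object."""
    return COG_TEMPLATE.format(cog_id=cog_id, body=body, x=x, y=y)

PAGE = '<svg xmlns="http://www.w3.org/2000/svg" width="595" height="842">{}</svg>'
logo = '<rect width="40" height="40" fill="navy"/>'

# the same COG placed twice on an A4-sized page; relocation touches only
# the translate() values, never the COG body
print(PAGE.format(place_cog("logo-1", logo, 50, 50) +
                  place_cog("logo-2", logo, 300, 700)))
```

Pinning attributes such as fill and font-size on the wrapper is what guards against SVG's tree-based inheritance: whatever graphic state the surrounding document establishes, the COG renders identically.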
Abstract:
This paper, based on the outcome of discussions at a NORMAN Network-supported workshop in Lyon (France) in November 2014, aims to provide a common position of passive sampling community experts regarding the concrete actions required to foster the use of passive sampling techniques in support of contaminant risk assessment and management, and for routine monitoring of contaminants in aquatic systems. The brief roadmap presented here focusses on the identification of robust passive sampling methodology, on technology that requires further development or has yet to be developed, on our current knowledge of the evaluation of uncertainties when calculating a freely dissolved concentration, and on the relationship between passive sampling data and those obtained through biomonitoring. A tiered approach to identifying areas of potential environmental quality standard (EQS) exceedances is also shown. Finally, we propose a list of recommended actions to improve the acceptance of passive sampling by policy-makers. These include the drafting of guidelines, quality assurance and control procedures, developing demonstration projects where biomonitoring and passive sampling are undertaken alongside each other, organising proficiency testing schemes and interlaboratory comparisons, and, finally, establishing passive sampler-based assessment criteria in relation to existing EQS.
Abstract:
The In Situ Analysis System (ISAS) was developed to produce gridded fields of temperature and salinity that preserve as much as possible the time and space sampling capabilities of the Argo network of profiling floats. Since the first global re-analysis performed in 2009, the system has evolved, and a careful delayed-mode processing of the 2002-2012 dataset has been carried out using version 6 of ISAS and updated statistics to produce the ISAS13 analysis. This latest version is now implemented as the operational analysis tool at the Coriolis data centre. The robustness of the results with respect to the system's evolution is explored through global quantities of climatological interest: the Ocean Heat Content and the Steric Height. Estimates of errors consistent with the methodology are computed. This study shows that building reliable statistics on the fields is fundamental to improving the monthly estimates and determining the absolute error bars. The new mean fields and variances deduced from the ISAS13 re-analysis and dataset show significant changes relative to the previous ISAS estimates, in particular in the Southern Ocean, justifying the iterative procedure. During the decade covered by Argo, the intermediate waters appear warmer and saltier in the North Atlantic, and fresher in the Southern Ocean, than in the WOA05 long-term mean. At the inter-annual scale, the impact of ENSO on the Ocean Heat Content and Steric Height is observed during the 2006-2007 and 2009-2010 events captured by the network.
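As a reminder of what the Ocean Heat Content diagnostic involves, the sketch below integrates a toy temperature profile over depth using the standard column integral OHC = ρ₀ c_p ∫ T dz. The profile and constants are illustrative; this is not ISAS code or data.

```python
import numpy as np

# Back-of-envelope Ocean Heat Content for one water column (toy profile):
# OHC = rho0 * cp * integral(T dz), with rho0 and cp taken as constants.

rho0, cp = 1025.0, 3985.0                 # kg m^-3, J kg^-1 K^-1 (nominal values)
z = np.linspace(0.0, 2000.0, 101)         # depth levels, m (Argo-like 0-2000 m range)
T = 2.0 + 16.0 * np.exp(-z / 350.0)       # toy temperature profile, deg C

ohc = rho0 * cp * np.trapz(T, z)          # joules per square metre of ocean column
print("0-2000 m column OHC: %.3e J m^-2" % ohc)

# sensitivity: a uniform 0.01 degC warming of the whole column changes OHC by
print("sensitivity: %.3e J m^-2 per 0.01 degC" % (rho0 * cp * 0.01 * z[-1]))
```

In a gridded analysis such as ISAS the same integral is evaluated in every grid cell and area-weighted into a global time series, which is why reliable background statistics on the temperature field translate directly into reliable OHC error bars.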
Abstract:
This dissertation seeks to discern the impact of social housing on public health in the cities of Glasgow, Scotland and Baltimore, Maryland in the twentieth century. Additionally, it seeks to compare the implementation of social housing policy in both cities, to determine the efficacy of social housing as a tool of public health betterment. This is accomplished through the exposition and evaluation of the housing and health trends of both cities over the latter half of the twentieth century. Both Glasgow and Baltimore had long struggled with overcrowded slum districts and relatively unhealthy populations. Early commentators had noticed the connection between insanitary housing and poor health, and sought a solution to both of these problems. Beginning in the 1940s, housing reform advocates (self-dubbed 'housers') pressed for the development of social housing, or municipally controlled housing for low-income persons, to alleviate the problems of overcrowded slum dwellings in both cities. The impetus for social housing was twofold: to provide affordable housing to low-income persons and to provide housing that would facilitate healthy lives for tenants. Whether social housing achieved these goals is the crux of this dissertation. In the years immediately following the Second World War, social housing was built en masse in both cities. Social housing provided a reprieve from slum housing for both working-class Glaswegians and Baltimoreans. In Baltimore specifically, social housing provided accommodation for the city's Black residents, who found it difficult to occupy housing in White neighbourhoods. As the years progressed, social housing developments in both cities faced unexpected problems. In Glasgow, the flight of stable tenants (including both middle-class and skilled artisan workers) resulted in a concentration of poverty in the city's housing schemes, and in Baltimore, a flight of White tenants of all income levels created a new kind of state-subsidized segregated housing stock. The high-rise tower blocks built in both cities, once heralded as a symbol of housing modernity, also faced increased scrutiny in the 1960s and 1970s. During the period 1940-1980, before policy-makers in the United States began to eschew social housing in favour of subsidized private housing vouchers and before community-based housing associations had truly taken off in Britain, public health professionals conducted academic studies of the impact of social housing tenancy on health. Their findings provide the evidence used to assess the second objective of social housing provision, as outlined above. Put simply, while social housing units were undoubtedly better equipped than slum dwellings in both cities, the public health investigations into the impact of rehousing slum dwellers into social housing revealed that social housing was not a panacea for each city's social and public health problems.
Abstract:
Background: Thrombocytopenia has been shown to predict mortality. We hypothesize that platelet indices may be more useful prognostic indicators. Our study subjects were children aged one month to 14 years admitted to our hospital. Aim: To determine whether the platelet count, plateletcrit (PCT), mean platelet volume (MPV) and platelet distribution width (PDW), and their ratios, can predict mortality in hospitalised children. Methods: Children who died during their hospital stay were the cases. Controls were age-matched children admitted contemporaneously. The first blood sample after admission was used for analysis. A receiver operating characteristic (ROC) curve was used to identify the best threshold for the measured variables and the ratios studied. Multiple regression analysis was done to identify independent predictors of mortality. Results: Forty cases and forty controls were studied. The platelet count, PCT and the ratios MPV/platelet count, MPV/PCT, PDW/platelet count, PDW/PCT and (MPV × PDW)/(platelet count × PCT) differed significantly between children who survived and those who died. On multiple regression analysis, the ratios MPV/PCT, PDW/platelet count and MPV/platelet count were risk factors for mortality, with odds ratios of 4.31 (95% CI, 1.69-10.99), 3.86 (95% CI, 1.53-9.75) and 3.45 (95% CI, 1.38-8.64) respectively. In 67% of the patients who died, the MPV/PCT ratio was above 41.8 and PDW/platelet count was above 3.86. In 65% of patients who died, MPV/platelet count was above 3.45. Conclusion: MPV/PCT, PDW/platelet count and MPV/platelet count in the first sample after admission were, in this case-control study, predictors of mortality and could predict 65% to 67% of deaths accurately.
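The ROC-based threshold selection described in the methods can be sketched as follows, using synthetic ratio values and Youden's J statistic to pick the cut-off; the numbers below are invented and do not reproduce the study's data.

```python
import numpy as np
from sklearn.metrics import roc_curve

# Illustrative ROC threshold selection for a platelet-index ratio
# (synthetic values, not the study's measurements).

rng = np.random.default_rng(2)
died = np.array([1] * 40 + [0] * 40)             # 40 cases, 40 controls
ratio = np.where(died == 1,
                 rng.normal(55.0, 15.0, 80),     # assumed higher ratios in deaths
                 rng.normal(30.0, 10.0, 80))

fpr, tpr, thresholds = roc_curve(died, ratio)
best = np.argmax(tpr - fpr)                      # maximise Youden's J = sens + spec - 1
print("best cut-off: %.1f (sensitivity %.2f, specificity %.2f)"
      % (thresholds[best], tpr[best], 1 - fpr[best]))
```

Choosing the threshold at the maximum of Youden's J balances sensitivity against specificity, which is the usual way a single "best threshold" is read off a ROC curve in studies of this kind.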
Abstract:
Queueing theory provides models, structural insights, problem solutions and algorithms to many application areas. Owing to its practical applicability to production, manufacturing, home automation, communications technology, etc., ever more complex systems require more elaborate models, techniques and algorithms to be developed. Discrete-time models are very suitable in many situations, although a number of features make the analysis of discrete-time systems technically more involved than that of their continuous-time counterparts. In this paper we consider a discrete-time queueing system in which failures of the server can occur, as well as priority messages. The possibility of server failures with a general lifetime distribution is considered. We carry out an extensive study of the system by computing generating functions for the steady-state distribution of the number of messages in the queue and in the system. We also obtain generating functions for the stationary distribution of the busy period and of the sojourn times of a message in the server and in the system. Performance measures of the system are also provided.
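A simple way to build intuition for such a system is direct simulation. The Python sketch below simulates a discrete-time single-server queue with priority and ordinary messages and with server breakdowns; for brevity it uses geometric service and repair times rather than the general lifetime distribution treated analytically in the paper, and all parameters are invented.

```python
import random

# Slot-by-slot simulation of a discrete-time queue with non-preemptive
# priority and server breakdowns (geometric assumptions for simplicity).

random.seed(0)
p_arr, p_prio = 0.3, 0.2                 # arrival prob. per slot; share of priority
p_srv, p_fail, p_rep = 0.5, 0.01, 0.2    # service / failure / repair prob. per slot

high, low = [], []                        # priority and ordinary queues
up, in_service = True, None
sizes = []
for slot in range(100000):
    if random.random() < p_arr:           # at most one arrival per slot
        (high if random.random() < p_prio else low).append(slot)
    if up and random.random() < p_fail:
        up = False                         # server breaks down (work is kept)
    elif not up and random.random() < p_rep:
        up = True                          # repair completes
    if up:
        if in_service is None and (high or low):
            in_service = (high if high else low).pop(0)  # priority served first
        if in_service is not None and random.random() < p_srv:
            in_service = None              # service completes
    sizes.append(len(high) + len(low) + (in_service is not None))

print("mean number of messages in system: %.3f" % (sum(sizes) / len(sizes)))
```

With these rates the server is up about 95% of the time, so the effective service rate comfortably exceeds the arrival rate and the simulated mean can be checked against the generating-function results for the geometric special case.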