900 results for Data Structures, Cryptology and Information Theory


Relevance:

100.00%

Publisher:

Abstract:

In the early nineties, Mark Weiser wrote a series of seminal papers that introduced the concept of Ubiquitous Computing. According to Weiser, computers demand too much attention from the user, drawing their focus away from the tasks at hand. Instead of being the centre of attention, computers should be so natural that they vanish into the human environment: they become not only truly pervasive but also effectively invisible and unobtrusive to the user. This calls not only for smaller, cheaper, lower-power computers, but also for equally convenient display solutions that can be harmoniously integrated into our surroundings. With the advent of Printed Electronics, new ways to link the physical and digital worlds have become available. By combining common printing techniques such as inkjet printing with electro-optical functional inks, it is becoming possible not only to mass-produce extremely thin, flexible and cost-effective electronic circuits but also to introduce electronic functionality into products where it was previously unavailable. Indeed, Printed Electronics is enabling the creation of novel sensing and display elements for interactive devices, free of form-factor constraints. At the same time, the growing availability and affordability of digital fabrication technologies, namely 3D printers, to the average consumer is fostering a new industrial (digital) revolution and the democratisation of innovation. Nowadays, end-users can already custom-design their own physical products and manufacture them on demand, according to their own needs. In the future, they will be able to fabricate interactive digital devices with user-specific form and functionality from the comfort of their homes.
This thesis explores how task-specific, low-computation, interactive devices capable of presenting dynamic visual information can be created using Printed Electronics technologies, following an approach based on the ideals behind Personal Fabrication. Focus is given to the use of printed electrochromic displays as a medium for delivering dynamic digital information. Several approaches are highlighted and categorised according to the architecture of the displays. Furthermore, a pictorial computation model based on extended cellular automata principles is used to programme dynamic simulation models into matrix-based electrochromic displays. Envisaged applications include the modelling of physical, chemical, biological, and environmental phenomena.
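To make the cellular-automaton idea concrete, here is a minimal sketch of how such a pictorial model might drive a matrix display. The update rule and the character-based rendering are illustrative inventions, not the thesis's actual model:

```python
# Illustrative sketch (hypothetical rule, not the thesis's actual model):
# a cellular automaton driving a matrix display, where each cell's on/off
# state maps to one electrochromic pixel (coloured vs bleached).

def step(grid):
    """One synchronous update: a cell turns on iff exactly one of its
    four von Neumann neighbours is on (a simple diffusion-like rule)."""
    rows, cols = len(grid), len(grid[0])
    new = [[0] * cols for _ in range(rows)]
    for r in range(rows):
        for c in range(cols):
            on_neighbours = sum(
                grid[nr][nc]
                for nr, nc in ((r - 1, c), (r + 1, c), (r, c - 1), (r, c + 1))
                if 0 <= nr < rows and 0 <= nc < cols
            )
            new[r][c] = 1 if on_neighbours == 1 else 0
    return new

def render(grid):
    """Map cell states to display pixels ('#' = coloured, '.' = bleached)."""
    return "\n".join("".join("#" if cell else "." for cell in row) for row in grid)

grid = [[0] * 5 for _ in range(5)]
grid[2][2] = 1                  # single seed in the centre
print(render(step(grid)))       # the seed spreads to its four neighbours
```

In a real matrix electrochromic display, `render` would instead set drive voltages on the row/column electrodes addressing each pixel.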

Relevance:

100.00%

Publisher:

Abstract:

Urban mobility is one of the main challenges facing urban areas: a growing population and traffic congestion result in environmental pressures. The pathway to sustainable urban mobility involves strengthening intermodal mobility. The integrated use of different transport modes is becoming increasingly important, and intermodality has been mentioned as a way for public transport to compete with private cars. The aim of the current dissertation is to define a set of strategies to improve urban mobility in Lisbon and, by consequence, reduce the environmental impacts of transport. To that end, several intermodal practices across Europe were analysed, and the transport systems of Brussels and Lisbon were studied and compared, with special attention to intermodal systems. For the case study, data were gathered in the field in both cities, by using and observing the different transport modes, and two surveys of the cities' users were conducted. As the study concluded, Brussels and Lisbon present significant differences. In Brussels the measures to promote intermodality are evident, while in Lisbon much still needs to be done. The study also made clear the need to improve Lisbon's public transport towards a more intermodal passenger transport system, through the integration of different transport modes and better information and ticketing systems. Some of the points requiring development are: interchanges' waiting areas; integration of bicycles in public transport; information about connections with other transport modes; and real-time information to passengers pre-trip and on-trip, especially on buses and trams. After identifying the best practices in Brussels and the weaknesses in Lisbon, the possibility of applying some of Brussels' practices to Lisbon was evaluated.
Brussels proved to be a good example of intermodality, and for that reason some of the recommendations to improve intermodal mobility in Lisbon can follow the practices in place in Brussels.

Relevance:

100.00%

Publisher:

Abstract:

This paper surveys the recent literature on convergence across countries and regions. I discuss the main convergence and divergence mechanisms identified in the literature and develop a simple model that illustrates their implications for income dynamics. I then review the existing empirical evidence and discuss its theoretical implications. Early optimism concerning the ability of a human capital-augmented neoclassical model to explain productivity differences across economies has been questioned on the basis of more recent contributions that make use of panel data techniques and obtain theoretically implausible results. Some recent research in this area tries to reconcile these findings with sensible theoretical models by exploring the role of alternative convergence mechanisms and the possible shortcomings of panel data techniques for convergence analysis.

Relevance:

100.00%

Publisher:

Abstract:

Descriptive set theory is mainly concerned with studying subsets of the space of all countable binary sequences. In this paper we study the generalization where countable is replaced by uncountable. We explore properties of generalized Baire and Cantor spaces, equivalence relations and their Borel reducibility. The study shows that descriptive set theory looks very different in this generalized setting compared to the classical, countable case. We also draw a connection between the stability-theoretic complexity of first-order theories and the descriptive set-theoretic complexity of their isomorphism relations. Our results suggest that Borel reducibility on uncountable structures is a model-theoretically natural way to compare the complexity of isomorphism relations.
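For reference, the central tool here, Borel reducibility, has a standard definition (stated for the classical case; the paper transfers it to the generalized, uncountable setting):

```latex
% E and F are equivalence relations on spaces X and Y (e.g. Baire or Cantor space).
% E is Borel reducible to F when a Borel map transfers E-equivalence to F-equivalence:
E \le_B F \iff \exists\, f : X \to Y \text{ Borel such that }
\forall x_1, x_2 \in X :\; x_1 \mathrel{E} x_2 \Leftrightarrow f(x_1) \mathrel{F} f(x_2)
```

Intuitively, E ≤_B F says that classifying objects up to E is no harder than classifying them up to F, which is why it serves as a complexity comparison for isomorphism relations.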

Relevance:

100.00%

Publisher:

Abstract:

Classical treatments of problems of sequential mate choice assume that the distribution of the quality of potential mates is known a priori. This assumption, made for analytical purposes, may seem unrealistic, contradicting empirical data as well as evolutionary arguments. Using stochastic dynamic programming, we develop a model that allows searching individuals to learn about the distribution, in particular by updating its mean and variance during the search. In a constant environment, a priori knowledge of the parameter values brings strong benefits in both the time needed to make a decision and the average value of the mate obtained. Knowing the variance yields more benefit than knowing the mean, and the benefits increase with variance. However, the costs of learning become progressively lower as more time is available for choice. When parameter values differ between demes and/or searching periods, a strategy relying on fixed a priori information might lead to erroneous decisions, which confers advantages on the learning strategy. However, time for choice plays an important role as well: if a decision must be made rapidly, a fixed strategy may do better even when the fixed image does not coincide with the local parameter values. These results help delineate the ecological-behavioural context in which learning strategies may spread.
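The learning idea can be sketched in code. The following is a toy illustration under invented assumptions (an online mean/variance update and a time-decaying acceptance threshold), not the paper's actual dynamic-programming model:

```python
import random

def search_with_learning(qualities, horizon):
    """Toy sketch (not the paper's model): inspect candidates one by one,
    updating a running estimate of the quality distribution (Welford's
    algorithm), and accept once a candidate exceeds the estimated mean
    plus a time-decaying fraction of the estimated standard deviation."""
    n, mean, m2 = 0, 0.0, 0.0
    for t, q in enumerate(qualities[:horizon]):
        # Welford's online update of mean and (sum-of-squares for) variance
        n += 1
        delta = q - mean
        mean += delta / n
        m2 += delta * (q - mean)
        sd = (m2 / n) ** 0.5 if n > 1 else 0.0
        # selectivity declines as the deadline approaches
        threshold = mean + sd * (1 - t / horizon)
        if n > 1 and q >= threshold:
            return q                    # accept this mate
    return qualities[horizon - 1]       # deadline reached: take the last candidate

random.seed(1)
qualities = [random.gauss(10, 2) for _ in range(50)]
print(search_with_learning(qualities, horizon=50))
```

The trade-off the abstract describes shows up directly: a shorter `horizon` forces acceptance before the running estimates have stabilised, which is when a fixed a priori threshold can outperform learning.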

Relevance:

100.00%

Publisher:

Abstract:

BACKGROUND: Knowledge about their past medical history is central for childhood cancer survivors to ensure informed decisions in their health management, yet knowledge about information provision and information needs in this population is still scarce. We thus aimed to assess: (1) the information survivors reported having received on disease, treatment, follow-up, and late effects; (2) their information needs in these four domains and the format in which they would like it provided; and (3) the association with psychological distress and quality of life (QoL). PROCEDURE: As part of the Follow-up survey of the Swiss Childhood Cancer Survivor Study, we sent a questionnaire to all survivors (≥18 years) who had previously participated in the baseline survey and had been diagnosed with cancer after 1990 at an age of <16 years. RESULTS: Most survivors had received oral information only (on illness: oral 82%, written 38%; treatment: oral 79%, written 36%; follow-up: oral 77%, written 23%; late effects: oral 68%, written 14%). Most survivors who had not previously received any information rated it as important, especially information on late effects (71%). A large proportion of survivors reported current information needs and would like to receive personalized information, especially on late effects (44%). Survivors with higher information needs reported higher psychological distress and lower QoL. CONCLUSIONS: Survivors want to be more informed, especially on possible late effects, and want to receive personalized information. Improving information provision, both qualitatively and quantitatively, will allow survivors to have better control of their health and to become better decision makers.

Relevance:

100.00%

Publisher:

Abstract:

Returns to scale to capital and the strength of capital externalities play a key role for the empirical predictions and policy implications of different growth theories. We show that both can be identified with individual wage data and implement our approach at the city-level using US Census data on individuals in 173 cities for 1970, 1980, and 1990. Estimation takes into account fixed effects, endogeneity of capital accumulation, and measurement error. We find no evidence for human or physical capital externalities and decreasing aggregate returns to capital. Returns to scale to physical and human capital are around 80 percent. We also find strong complementarities between human capital and labor and substantial total employment externalities.


Relevance:

100.00%

Publisher:

Abstract:

We analyze the implications of a market imperfection related to the inability to establish intellectual property rights, which we label "unverifiable communication". Employees are able to collude with external parties by selling the "knowledge capital" of the firm. The firm organizer engages in strategic interaction simultaneously with employees and competitors, as she introduces endogenous transaction costs in the market for information between those agents. Incentive schemes and communication costs are the key strategic variables used by the firm to induce frictions in collusive markets. Unverifiable communication introduces severe allocative distortions, both in internal product development and in the intended sale of information (technology transfer). We derive implications of the model for observable decisions such as characteristics of the employment relationship (full employment, incompatibility with other jobs), firms' preferences over cluster characteristics in location decisions, optimal size at entry, in-house development versus sale strategies for innovations, and industry evolution.

Relevance:

100.00%

Publisher:

Abstract:

We present a georeferenced photomosaic of the Lucky Strike hydrothermal vent field (Mid-Atlantic Ridge, 37°18'N). The photomosaic was generated from digital photographs acquired with the ARGO II seafloor imaging system during the 1996 LUSTRE cruise, which surveyed a ~1 km2 zone and provided coverage of ~20% of the seafloor. The photomosaic has a pixel resolution of 15 mm and encloses the areas with known active hydrothermal venting. The final mosaic is generated after an optimization that includes the automatic detection of the same benthic features across different images (feature matching), followed by a global alignment of images based on the vehicle navigation. We also provide software to construct mosaics from large sets of images for which georeferencing information exists (location, attitude, and altitude per image), to visualize them, and to extract data. Georeferencing information can be provided by the raw navigation data (collected during the survey) or result from the optimization obtained from image matching. Mosaics based solely on navigation can be readily generated by any user, but the optimization and global alignment of the mosaic require a case-by-case approach for which no universal software is available. The Lucky Strike photomosaics (optimized and navigation-only) are publicly available through the Marine Geoscience Data System (MGDS, http://www.marine-geo.org). The mosaic-generating and viewing software is available through the Computer Vision and Robotics Group web page at the University of Girona (http://eia.udg.es/_rafa/mosaicviewer.html).
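The navigation-only placement step the abstract mentions can be sketched as follows. This is an illustrative flat-seafloor model with an invented scale constant, not the cruise's actual pipeline; real systems refine such initial placements with feature matching and global alignment:

```python
import math

# Illustrative sketch: place one image pixel into a common mosaic frame
# using only navigation data (position, heading, altitude).

def image_to_mosaic(nav, px, py, metres_per_pixel_at_1m=0.001):
    """Map a pixel offset (px, py), measured from the image centre, to
    mosaic coordinates in metres. `nav` holds (x, y, heading_rad,
    altitude_m); the ground footprint of a pixel scales linearly with
    altitude under a flat-seafloor assumption."""
    x, y, heading, altitude = nav
    scale = metres_per_pixel_at_1m * altitude     # metres per pixel
    # rotate the pixel offset by the vehicle heading, then translate
    gx = x + scale * (px * math.cos(heading) - py * math.sin(heading))
    gy = y + scale * (px * math.sin(heading) + py * math.cos(heading))
    return gx, gy

# a pixel 100 px from the image centre; vehicle at (10, 20), heading 0, altitude 5 m
print(image_to_mosaic((10.0, 20.0, 0.0, 5.0), 100, 0))  # → (10.5, 20.0)
```

Applying this transform to every pixel of every image yields a navigated mosaic; the case-by-case optimization then corrects the residual navigation error where overlapping images share features.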

Relevance:

100.00%

Publisher:

Abstract:

In this article, we analyze the ability of the early olfactory system to detect and discriminate different odors by means of information-theoretic measurements applied to olfactory bulb activity images. We have studied the role that the diversity and number of receptor neuron types play in encoding chemical information. Our results show that the olfactory receptors of the biological system are weakly correlated and provide good coverage of the input space. The coding capacity of ensembles of olfactory receptors with the same receptive range is maximized when the receptors cover half of the odor input space, a configuration that corresponds to receptors that are not particularly selective. However, the ensemble's performance increases slightly when mixing uncorrelated receptors of different receptive ranges. Our results confirm that low correlation between sensors can be more significant than sensor selectivity for general-purpose chemo-sensory systems, whether biological or biomimetic.
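The half-coverage result has a simple information-theoretic intuition, sketched below: a binary sensor that responds to a fraction p of inputs has response entropy H(p), which peaks at p = 0.5. This is a textbook illustration, not the article's actual measurement on bulb activity images:

```python
import math

# A binary receptor responding to a fraction p of the odour space has
# response entropy H(p) = -p*log2(p) - (1-p)*log2(1-p) bits, maximized
# when it covers half the space (p = 0.5), i.e. when it is unselective.

def response_entropy(p):
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

coverages = [0.1, 0.3, 0.5, 0.7, 0.9]
entropies = [response_entropy(p) for p in coverages]
print(max(zip(entropies, coverages)))  # → (1.0, 0.5)
```

For an ensemble, total information also depends on how correlated the receptors are, which is why the abstract finds low inter-sensor correlation to matter more than individual selectivity.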

Relevance:

100.00%

Publisher:

Abstract:

In order to study the connections between the Lagrangian and Hamiltonian formalisms constructed from a (perhaps singular) higher-order Lagrangian, some geometric structures are constructed. Intermediate spaces between those of the Lagrangian and Hamiltonian formalisms, partial Ostrogradskii transformations, and unambiguous evolution operators connecting these spaces are intrinsically defined, and some of their properties are studied. Equations of motion, constraints, and arbitrary functions of the Lagrangian and Hamiltonian formalisms are thoroughly studied. In particular, all the Lagrangian constraints are obtained from the Hamiltonian ones. Once the gauge transformations are taken into account, the true number of degrees of freedom is obtained, both in the Lagrangian and Hamiltonian formalisms, and also in all the intermediate formalisms herein defined.
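For context, the Ostrogradskii transformation the abstract refers to is standard higher-order mechanics (stated here for the second-order case, not taken from the paper): for a Lagrangian L(q, \dot q, \ddot q) one introduces momenta conjugate to q and \dot q,

```latex
% Ostrogradskii momenta for a second-order Lagrangian L(q, \dot{q}, \ddot{q})
p_1 = \frac{\partial L}{\partial \dot{q}}
      - \frac{d}{dt}\frac{\partial L}{\partial \ddot{q}},
\qquad
p_2 = \frac{\partial L}{\partial \ddot{q}}
```

When L is singular, the second relation cannot be inverted for \ddot{q}, and constraints of the kind analysed in the paper appear.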

Relevance:

100.00%

Publisher:

Abstract:

Sequencing of pools of individuals (Pool-Seq) represents a reliable and cost-effective approach for estimating genome-wide SNP and transposable element insertion frequencies. However, Pool-Seq does not provide direct information on haplotypes so that, for example, obtaining inversion frequencies has not been possible until now. Here, we have developed a new set of diagnostic marker SNPs for seven cosmopolitan inversions in Drosophila melanogaster that can be used to infer inversion frequencies from Pool-Seq data. We applied our novel marker set to Pool-Seq data from an experimental evolution study and from North American and Australian latitudinal clines. In the experimental evolution data, we find evidence that positive selection has driven the frequencies of In(3R)C and In(3R)Mo to increase over time. In the clinal data, we confirm the existence of frequency clines for In(2L)t, In(3L)P and In(3R)Payne in both North America and Australia and detect a previously unknown latitudinal cline for In(3R)Mo in North America. The inversion markers developed here provide a versatile and robust tool for characterizing inversion frequencies and their dynamics in Pool-Seq data from diverse D. melanogaster populations.
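The inference step can be sketched simply. Read counts and marker positions below are hypothetical, not the paper's data; the estimator is the natural one for perfectly diagnostic markers (the alternate allele fixed in inverted chromosomes and absent from standard ones):

```python
# Illustrative sketch: estimate an inversion's frequency in a pooled
# sample as the mean allele frequency across its diagnostic marker SNPs.

def inversion_frequency(marker_counts):
    """marker_counts: list of (alt_reads, total_reads) at diagnostic SNPs.
    Under the perfectly-diagnostic assumption, each marker's alternate
    allele frequency estimates the inversion frequency directly, so we
    average across markers to reduce sampling noise."""
    freqs = [alt / total for alt, total in marker_counts if total > 0]
    return sum(freqs) / len(freqs)

# read counts at three hypothetical diagnostic markers for one inversion
counts = [(18, 60), (21, 70), (12, 50)]
print(round(inversion_frequency(counts), 3))  # → 0.28
```

In practice markers may be imperfectly diagnostic, which is why a curated, validated marker set such as the one the paper develops is needed.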

Relevance:

100.00%

Publisher:

Abstract:

In the 1920s, Ronald Fisher developed the theory behind the p value, and Jerzy Neyman and Egon Pearson developed the theory of hypothesis testing. These distinct theories have provided researchers with important quantitative tools to confirm or refute their hypotheses. The p value is the probability of obtaining an effect equal to or more extreme than the one observed, presuming the null hypothesis of no effect is true; it gives researchers a measure of the strength of evidence against the null hypothesis. As commonly used, investigators select a threshold p value below which they reject the null hypothesis. The theory of hypothesis testing allows researchers to reject a null hypothesis in favor of an alternative hypothesis of some effect. As commonly used, investigators choose Type I error (rejecting the null hypothesis when it is true) and Type II error (accepting the null hypothesis when it is false) levels and determine a critical region. If the test statistic falls into that critical region, the null hypothesis is rejected in favor of the alternative hypothesis. Despite similarities between the two, the p value and the theory of hypothesis testing are different theories that are often misunderstood and confused, leading researchers to improper conclusions. Perhaps the most common misconception is to consider the p value as the probability that the null hypothesis is true, rather than the probability of obtaining the observed difference, or one more extreme, given that the null is true. Another concern is the risk that an important proportion of statistically significant results are falsely significant. Researchers should have a minimum understanding of these two theories so that they are better able to plan, conduct, interpret, and report scientific experiments.
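The distinction can be made concrete with a one-sample z-test, which reports both quantities side by side: Fisher's continuous p value and the binary Neyman-Pearson decision at a pre-chosen alpha. The data and parameter values below are hypothetical:

```python
import math

# One-sample two-sided z-test: returns Fisher's p value (strength of
# evidence against H0) and the Neyman-Pearson decision (reject/accept
# at a pre-specified Type I error level alpha).

def z_test(sample_mean, mu0, sigma, n):
    z = (sample_mean - mu0) / (sigma / math.sqrt(n))
    # two-sided p value from the standard normal CDF (via erf)
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    z_crit = 1.96                      # two-sided critical value for alpha = 0.05
    reject = abs(z) > z_crit           # Neyman-Pearson decision rule
    return p_value, reject

p, reject = z_test(sample_mean=10.5, mu0=10.0, sigma=2.0, n=100)
print(round(p, 4), reject)  # → 0.0124 True
```

Note what each output does and does not say: p = 0.0124 is the probability of data at least this extreme given a true null, not the probability that the null is true; and `reject` is only a decision under the chosen error levels, not a measure of effect size.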

Relevance:

100.00%

Publisher:

Abstract:

The aim of this paper is to describe the process and challenges in building exposure scenarios for engineered nanomaterials (ENM), using an exposure scenario format similar to that used for the European Chemicals regulation (REACH). Over 60 exposure scenarios were developed based on information from publicly available sources (literature, books, and reports), publicly available exposure estimation models, occupational sampling campaign data from partnering institutions, and industrial partners regarding their own facilities. The primary focus was on carbon-based nanomaterials, nano-silver (nano-Ag) and nano-titanium dioxide (nano-TiO2), and included occupational and consumer uses of these materials with consideration of the associated environmental release. The process of building exposure scenarios illustrated the availability and limitations of existing information and exposure assessment tools for characterizing exposure to ENM, particularly as it relates to risk assessment. This article describes the gaps in the information reviewed, recommends future areas of ENM exposure research, and proposes types of information that should, at a minimum, be included when reporting the results of such research, so that the information is useful in a wider context.