61 results for classical conditioning, mere exposure effect, classical conditioning of preferences.


Relevance: 100.00%

Abstract:

Nowadays, Workflow Management Systems (WfMSs) and, more generally, Process Management Systems (PMSs), which are process-aware Information Systems (PAISs), are widely used to support many human organizational activities, ranging from well-understood, relatively stable and structured processes (supply chain management, postal delivery tracking, etc.) to processes that are more complicated, less structured and may exhibit a high degree of variation (health-care, emergency management, etc.). Every aspect of a business process involves a certain amount of knowledge, which may be complex depending on the domain of interest. The adequate representation of this knowledge is determined by the modeling language used. Some processes behave in a way that is well understood, predictable and repeatable: the tasks are clearly delineated and the control flow is straightforward. Recent discussions, however, illustrate the increasing demand for solutions for knowledge-intensive processes, where these characteristics are less applicable. The actors involved in the conduct of a knowledge-intensive process have to deal with a high degree of uncertainty. Tasks may be hard to perform and the order in which they need to be performed may be highly variable. Modeling knowledge-intensive processes can be complex, as it may be hard to capture at design-time what knowledge will be available at run-time. In realistic environments, for example, actors lack important knowledge at execution time, or this knowledge can become obsolete as the process progresses. Even if each actor (at some point) has perfect knowledge of the world, it may not be certain of its beliefs at later points in time, since tasks by other actors may change the world without those changes being perceived. Typically, a knowledge-intensive process cannot be adequately modeled by classical, state-of-the-art process/workflow modeling approaches. In some respects there is a lack of maturity, both in capturing the semantic aspects involved and in reasoning about them. The main focus of the 1st International Workshop on Knowledge-intensive Business Processes (KiBP 2012) was investigating how techniques from different fields, such as Artificial Intelligence (AI), Knowledge Representation (KR), Business Process Management (BPM), Service Oriented Computing (SOC), etc., can be combined with the aim of improving the modeling and enactment phases of a knowledge-intensive process. KiBP 2012 was held as part of the program of the 2012 Knowledge Representation & Reasoning International Conference (KR 2012) in Rome, Italy, in June 2012. The workshop was hosted by the Dipartimento di Ingegneria Informatica, Automatica e Gestionale Antonio Ruberti of Sapienza Universita di Roma, with financial support from the University, through grant 2010-C26A107CN9 TESTMED, and from the EU Commission through the projects FP7-25888 Greener Buildings and FP7-257899 Smart Vortex. This volume contains the 5 papers accepted and presented at the workshop. Each paper was reviewed by three members of the internationally renowned Program Committee. In addition, a further paper was invited for inclusion in the workshop proceedings and for presentation at the workshop. Two keynote talks, one by Marlon Dumas (Institute of Computer Science, University of Tartu, Estonia) on "Integrated Data and Process Management: Finally?" and the other by Yves Lesperance (Department of Computer Science and Engineering, York University, Canada) on "A Logic-Based Approach to Business Processes Customization", completed the scientific program. We would like to thank all the Program Committee members for their valuable work in selecting the papers, Andrea Marrella for his valuable work as publication and publicity chair of the workshop, and Carola Aiello and the consulting agency Consulta Umbria for the organization of this successful event.

Relevance: 100.00%

Abstract:

1. The phylogeography of freshwater taxa is often integrally linked with landscape changes such as drainage re-alignments that may present the only avenue for historical dispersal for these taxa. Classical models of gene flow do not account for landscape changes and so are of little use in predicting phylogeography in geologically young freshwater landscapes. When the history of drainage formation is unknown, phylogeographical predictions can be based on current freshwater landscape structure, proposed historical drainage geomorphology, or on phylogeographical patterns of co-distributed taxa. 2. This study describes the population structure of a sedentary freshwater fish, the chevron snakehead (Channa striata), across two river drainages on the Indochinese Peninsula. The phylogeographical pattern recovered for C. striata was tested against seven hypotheses based on contemporary landscape structure, proposed history and phylogeographical patterns of co-distributed taxa. 3. Consistent with the species' ecology, analysis of mitochondrial and microsatellite loci revealed very high differentiation among all sampled sites. A strong signature of historical population subdivision was also revealed within the contemporary Mekong River Basin (MRB). Of the seven phylogeographical hypotheses tested, patterns of co-distributed taxa proved to be the most adequate for describing the phylogeography of C. striata. 4. Results shed new light on SE Asian drainage evolution, indicating that the Middle MRB probably evolved via amalgamation of at least three historically independent drainage sections and, in particular, that the Mekong River section centred around the northern Khorat Plateau in NE Thailand was probably isolated from the greater Mekong for an extensive period of evolutionary time. In contrast, C. striata populations in the Lower MRB do not show a phylogeographical signature of evolution in historically isolated drainage lines, suggesting drainage amalgamation has been less important for river landscape formation in this region.

Relevance: 100.00%

Abstract:

Significant wheel-rail dynamic forces occur because of imperfections in the wheels and/or rail. One of the key responses to the transmission of these forces down through the track is the impact force on the sleepers. Dynamic analysis of nonlinear systems is very complicated and does not lend itself easily to a classical solution of multiple equations. Trying to deduce the behaviour of track components from experimental data is very difficult because such data are hard to obtain and apply only to the particular conditions of the track being tested. The finite element method can be the best solution to this dilemma. This paper describes a finite element model, built using the software package ANSYS, for various-sized flat defects in the tread of a wheel rolling at a typical speed on heavy haul track. The paper explores the dynamic response of a prestressed concrete sleeper to these defects.

Relevance: 100.00%

Abstract:

Background: Ultraviolet radiation exposure during an individual's lifetime is a known risk factor for the development of skin cancer. However, less evidence is available on the relationship between lifetime sun exposure and skin damage and skin aging. Objectives: This study aims to assess the relationship between lifetime sun exposure and skin damage and skin aging using a non-invasive measure of exposure. Methods: We recruited 180 participants (73 males, 107 females) aged 18-83 years. Skin hyper-pigmentation (skin damage) and skin wrinkling (skin aging) on the facial region were measured by digital imaging. Lifetime sun exposure (in hours) was calculated from the participant's age multiplied by the estimated annual time outdoors for each year of life. We analyzed the effects of lifetime sun exposure on skin damage and skin aging, adjusting for the influence of age, sex, occupation, history of skin cancer, eye color, hair color, and skin color. Results: There were non-linear relationships between lifetime sun exposure and skin damage and skin aging. The skin of younger participants was much more sensitive to sun exposure than that of participants over 50 years of age; accordingly, there were negative interactions between lifetime sun exposure and age. Age had linear effects on skin damage and skin aging. Conclusion: The data showed that self-reported lifetime sun exposure was positively associated with skin damage and skin aging, particularly among younger people. Future health promotion for sun exposure needs to pay attention to this group in skin cancer prevention messaging. (C) 2012 Elsevier B.V. All rights reserved.
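
A minimal sketch of the lifetime-exposure calculation described in the Methods (the column names and numbers below are hypothetical; the study's own data handling is not shown here):

# Sketch (not the study's code) of the lifetime-exposure calculation described above:
# cumulative hours = estimated annual hours outdoors summed over each year of life,
# which reduces to age * annual hours when the annual estimate is constant.
import pandas as pd

participants = pd.DataFrame({
    "age": [25, 47, 68],
    "annual_hours_outdoors": [600, 350, 450],   # self-reported estimates (hypothetical)
})

# Lifetime sun exposure in hours, as described in the abstract.
participants["lifetime_exposure_h"] = (
    participants["age"] * participants["annual_hours_outdoors"]
)
print(participants)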

Relevance: 100.00%

Abstract:

The health impacts of exposure to ambient temperature have been drawing increasing attention from the environmental health research community, government, society, industries, and the public. Case-crossover and time series models are most commonly used to examine the effects of ambient temperature on mortality. However, some key methodological issues remain to be addressed. For example, few studies have used spatiotemporal models to assess the effects of spatial temperatures on mortality. Few studies have used a case-crossover design to examine the delayed (distributed lag) and non-linear relationship between temperature and mortality. Also, little evidence is available on the effects of temperature changes on mortality, and on differences in heat-related mortality over time. This thesis aimed to address the following research questions: 1. How can the case-crossover design and distributed lag non-linear models be combined? 2. Is there any significant difference in effect estimates between time series and spatiotemporal models? 3. How can the effects on mortality of temperature changes between neighbouring days be assessed? 4. Is there any change in temperature effects on mortality over time? To combine the case-crossover design and distributed lag non-linear model, datasets including deaths, weather conditions (minimum temperature, mean temperature, maximum temperature, and relative humidity), and air pollution were acquired from Tianjin, China, for the years 2005 to 2007. I demonstrated how to combine the case-crossover design with a distributed lag non-linear model. This allows the case-crossover design to estimate the non-linear and delayed effects of temperature whilst controlling for seasonality. There was a consistent U-shaped relationship between temperature and mortality. Cold effects were delayed by 3 days and persisted for 10 days. Hot effects were acute, lasted for three days, and were followed by mortality displacement for non-accidental, cardiopulmonary, and cardiovascular deaths. Mean temperature was a better predictor of mortality (based on model fit) than maximum or minimum temperature. It is still unclear whether spatiotemporal models using spatial temperature exposure produce better estimates of mortality risk compared with time series models that use a single site's temperature or averaged temperature from a network of sites. Daily mortality data were obtained from 163 locations across Brisbane city, Australia, from 2000 to 2004. Ordinary kriging was used to interpolate spatial temperatures across the city based on 19 monitoring sites. A spatiotemporal model was used to examine the impact of spatial temperature on mortality. A time series model was used to assess the effects on mortality of a single site's temperature, and of averaged temperature from 3 monitoring sites. Squared Pearson scaled residuals were used to check model fit. The results of this study show that even though spatiotemporal models gave a better model fit than time series models, the two gave similar effect estimates. Time series analyses using temperature recorded at a single monitoring site, or the average temperature of multiple sites, were as good at estimating the association between temperature and mortality as a spatiotemporal model. A time series Poisson regression model was used to estimate the association between temperature change and mortality in summer in Brisbane, Australia during 1996–2004 and Los Angeles, United States during 1987–2000.
Temperature change was calculated as the current day's mean temperature minus the previous day's mean. In Brisbane, a drop of more than 3 °C in temperature between days was associated with relative risks (RRs) of 1.16 (95% confidence interval (CI): 1.02, 1.31) for non-external mortality (NEM), 1.19 (95% CI: 1.00, 1.41) for NEM in females, and 1.44 (95% CI: 1.10, 1.89) for NEM in those aged 65-74 years. An increase of more than 3 °C was associated with RRs of 1.35 (95% CI: 1.03, 1.77) for cardiovascular mortality and 1.67 (95% CI: 1.15, 2.43) for people aged < 65 years. In Los Angeles, only a drop of more than 3 °C was significantly associated with RRs of 1.13 (95% CI: 1.05, 1.22) for total NEM, 1.25 (95% CI: 1.13, 1.39) for cardiovascular mortality, and 1.25 (95% CI: 1.14, 1.39) for people aged ≥ 75 years. In both cities, there were joint effects of temperature change and mean temperature on NEM. A change in temperature of more than 3 °C, whether positive or negative, has an adverse impact on mortality even after controlling for mean temperature. I examined the variation in the effects of high temperatures on elderly mortality (age ≥ 75 years) by year, city and region for 83 large US cities between 1987 and 2000. High temperature days were defined as two or more consecutive days with temperatures above the 90th percentile for each city during each warm season (May 1 to September 30). The mortality risk for high temperatures was decomposed into a "main effect" due to high temperatures, using a distributed lag non-linear function, and an "added effect" due to consecutive high temperature days. I pooled yearly effects across regions and overall effects at both regional and national levels. The effects of high temperature (both main and added effects) on elderly mortality varied greatly by year, city and region. Years with higher heat-related mortality were often followed by years with relatively lower mortality. Understanding this variability in the effects of high temperatures is important for the development of heat-warning systems. In conclusion, this thesis makes contributions in several respects. The case-crossover design was combined with a distributed lag non-linear model to assess the effects of temperature on mortality in Tianjin. This allows the case-crossover design to flexibly estimate the non-linear and delayed effects of temperature. Both extreme cold and high temperatures increased the risk of mortality in Tianjin. Time series models using a single site's temperature, or temperature averaged across several sites, can be used to examine the effects of temperature on mortality. Temperature change, whether a significant drop or a large increase, increases the risk of mortality. The high temperature effect on mortality is highly variable from year to year.
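
A minimal sketch of the temperature-change exposure described above and a Poisson regression on it (the data file, column names and library choice are assumptions for illustration; the seasonality, humidity and pollution adjustments used in the thesis are omitted here):

# Sketch of the temperature-change analysis described above (assumed data layout).
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

# One row per summer day: daily death counts and daily mean temperature (hypothetical file).
df = pd.read_csv("brisbane_summer_daily.csv")
df["temp_change"] = df["mean_temp"].diff()          # today's mean minus yesterday's mean
df["drop_gt3"] = (df["temp_change"] < -3).astype(int)
df["rise_gt3"] = (df["temp_change"] > 3).astype(int)

# Poisson regression of daily deaths on the temperature-change indicators,
# controlling for mean temperature (other confounders omitted for brevity).
model = smf.glm(
    "deaths ~ drop_gt3 + rise_gt3 + mean_temp",
    data=df.dropna(),
    family=sm.families.Poisson(),
).fit()
print(model.summary())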

Relevance: 100.00%

Abstract:

Transport processes within heterogeneous media may exhibit non-classical diffusion or dispersion, that is, behaviour not adequately described by the classical theory of Brownian motion and Fick's law. We consider a space-fractional advection-dispersion equation based on a fractional Fick's law. The equation involves the Riemann-Liouville fractional derivative, which arises from assuming that particles may make large jumps. Finite difference methods for solving this equation have been proposed by Meerschaert and Tadjeran. In the variable coefficient case, the product rule is first applied, and then the Riemann-Liouville fractional derivatives are discretised using standard and shifted Grünwald formulas, depending on the fractional order. In this work, we consider a finite volume method that deals directly with the equation in conservative form. Fractionally-shifted Grünwald formulas are used to discretise the fractional derivatives at control volume faces. We compare the two methods for several case studies from the literature, highlighting the convenience of the finite volume approach.
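
For orientation, a sketch in LaTeX of the operators involved (the notation is assumed here and may differ from the paper's): a space-fractional advection-dispersion equation in conservative form based on a fractional Fick's law, and the shifted Grünwald formula commonly used to discretise a Riemann-Liouville derivative of order 1 < \alpha \le 2 on a uniform grid with spacing h:

\frac{\partial u}{\partial t}
  = -\frac{\partial}{\partial x}\bigl(v(x)\,u\bigr)
  + \frac{\partial}{\partial x}\!\left(K(x)\,\frac{\partial^{\alpha-1}u}{\partial x^{\alpha-1}}\right),
\qquad 1 < \alpha \le 2,

\left.\frac{\partial^{\alpha}u}{\partial x^{\alpha}}\right|_{x_i}
  \approx \frac{1}{h^{\alpha}}\sum_{k=0}^{i+1} g_k^{(\alpha)}\,u\bigl(x_{i-k+1}\bigr),
\qquad g_k^{(\alpha)} = (-1)^{k}\binom{\alpha}{k}.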

Relevance: 100.00%

Abstract:

An accurate evaluation of the airborne particle dose-response relationship requires detailed measurements of the actual particle concentration levels that people are exposed to in every microenvironment in which they reside. The aim of this work was to perform an exposure assessment of children in relation to two different aerosol species: ultrafine particles (UFPs) and black carbon (BC). To this purpose, personal exposure measurements, in terms of UFP and BC concentrations, were performed on 103 children aged 8-11 years (10.1 ± 1.1 years) using hand-held particle counters and aethalometers. Simultaneously, a time-activity diary and a portable GPS were used to determine the children's daily time-activity pattern and estimate their inhaled dose of UFPs and BC. The median concentration to which the study population was exposed was found to be comparable to the high levels typically detected in urban traffic microenvironments, in terms of both particle number (2.2×10^4 part. cm^-3) and BC (3.8 μg m^-3) concentrations. Daily inhaled doses were also found to be relatively high, equal to 3.35×10^11 part. day^-1 and 3.92×10^1 μg day^-1 for UFPs and BC, respectively. Cooking and transportation were recognized as the main activities contributing to overall daily exposure to UFPs and BC, respectively, when normalized according to their corresponding time contribution. Therefore, UFPs and BC could represent tracers of children's exposure to particulate pollution from indoor cooking activities and from transportation microenvironments, respectively.
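
The abstract does not spell out the dose formula; a typical approach, sketched below with entirely hypothetical numbers, multiplies the concentration in each microenvironment by an inhalation rate and the time spent there, then sums over the day:

# Hedged sketch of a typical inhaled-dose estimate (not the study's actual code):
# dose = concentration * inhalation rate * time, summed over microenvironments.
activities = [
    # (microenvironment, UFP conc. part/cm^3, BC conc. ug/m^3, hours, inhalation m^3/h)
    ("home, cooking", 8.0e4, 5.0, 1.5, 0.54),
    ("transport",     4.5e4, 7.5, 1.0, 0.60),
    ("school",        1.0e4, 2.0, 6.0, 0.50),
    ("home, other",   1.2e4, 2.5, 15.5, 0.45),
]

ufp_dose = sum(c_ufp * 1e6 * rate * hours            # part/cm^3 -> part/m^3
               for _, c_ufp, _, hours, rate in activities)
bc_dose = sum(c_bc * rate * hours                    # ug/m^3 * m^3 = ug
              for _, _, c_bc, hours, rate in activities)

print(f"daily UFP dose ~ {ufp_dose:.2e} particles/day")
print(f"daily BC dose  ~ {bc_dose:.1f} ug/day")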

Relevance: 100.00%

Abstract:

This paper takes its root in a trivial observation: management approaches are unable to provide relevant guidelines to cope with uncertainty and trust in our modern worlds. Thus, managers look to reduce uncertainty through information-supported decision-making, sustained by ex-ante rationalization. They strive to achieve the best possible solution, stability, predictability, and control of the "future". Hence, they turn to a plethora of "prescriptive panaceas" and "management fads" that promise simple solutions through best practices. However, these solutions are ineffective. They address only one part of a system (e.g. an organization) instead of the whole. They miss the interactions and interdependencies with other parts, leading to "suboptimization". Further, classical cause-effect investigations and research are not very helpful in this regard. Where do we go from there? In this conversation, we want to challenge the assumptions supporting the traditional management approaches and shed some light on the problem of management discourse fads, using the concept of maturity and maturity models in the context of temporary organizations as a support for reflection. The global economy is characterized by the use and development of standards, and compliance to standards as a practice is said to enable better decision-making by managers under uncertainty, control of complexity, and higher performance. Amongst the plethora of standards, organizational maturity and maturity models hold a specific place, due to a general belief in organizational performance as a dependent variable of continuous (business) process improvement, grounded on a kind of evolutionary metaphor. Our intention is neither to offer a new "evidence based management fad" for practitioners, nor to suggest a research gap to scholars. Rather, we want to open an assumption-challenging conversation with regard to mainstream approaches (neo-classical economics and organization theory), turning "our eyes away from the blinding light of eternal certitude towards the refracted world of turbid finitude" (Long, 2002, p. 44), generating what Bernstein has named "Cartesian Anxiety" (Bernstein, 1983, p. 18), and revisit the conceptualization of maturity and maturity models. We rely on conventions theory and a systemic-discursive perspective. These two lenses have both information & communication and self-producing systems as common threads. Furthermore, the narrative approach is well suited to exploring complex ways of thinking about organizational phenomena as complex systems. This approach is relevant to our object of curiosity, i.e. the concept of maturity and maturity models, as maturity models (as standards) are discourses and systems of regulations. The main contribution of this conversation is that we suggest moving from a neo-classical "theory of the game", aiming at making the complex world simpler in playing the game, to a "theory of the rules of the game", aiming at influencing and challenging the rules of the game constitutive of maturity models - conventions, governing systems - making individual calculation compatible with the social context, and making possible the coordination of relationships and cooperation between agents with potentially divergent interests and values. A second contribution is the reconceptualization of maturity as structural coupling between conventions, rather than as an independent variable leading to organizational performance.

Relevance: 100.00%

Abstract:

As Earth's climate is rapidly changing, the impact of ambient temperature on health outcomes has attracted increasing attention in recent times. A considerable number of excess deaths have been reported because of exposure to ambient hot and cold temperatures. However, relatively little research has been conducted on the relation between temperature and morbidity. The aim of this study was to characterize the relationship between both hot and cold temperatures and emergency hospital admissions in Brisbane, Australia, and to examine whether the relation varied by age and socioeconomic factors. It also aimed to explore lag structures of the temperature–morbidity association for respiratory causes, and to estimate the magnitude of emergency hospital admissions for cardiovascular diseases attributable to hot and cold temperatures, given the large contribution of these diseases to total emergency hospital admissions. A time series study design was applied using routinely collected data on daily emergency hospital admissions, weather and air pollution variables in Brisbane during 1996–2005. A Poisson regression model with a distributed lag non-linear structure was adopted to assess the impact of temperature on emergency hospital admissions after adjustment for confounding factors. Both hot and cold effects were found, with a higher risk for hot temperatures than for cold temperatures. Increases in mean temperature above 24.2 °C were associated with increased morbidity, with the largest effect among the elderly aged ≥ 75 years. The magnitude of the risk estimates of hot temperature varied by age and socioeconomic factors. High population density, low household income, and unemployment appeared to modify the temperature–morbidity relation. There were different lag structures for hot and cold temperatures, with an acute hot effect within 3 days after hot exposure and an approximately 2-week lagged cold effect on respiratory diseases. A strong harvesting effect after 3 days was evident for respiratory diseases. People suffering from cardiovascular diseases were found to be more vulnerable to hot temperatures than to cold temperatures. However, more admissions for cardiovascular diseases were attributable to cold temperatures in Brisbane than to hot temperatures. This study contributes to the knowledge base about the association between temperature and morbidity, which is vitally important in the context of ongoing climate change. The findings of this study may provide useful information for the development and implementation of public health policy and strategic initiatives designed to reduce and prevent the burden of disease due to the impact of climate change.
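
The abstract does not state the formula used to attribute admissions to heat and cold; a standard relative-risk-based calculation (an assumption here, not quoted from the study) is, in LaTeX:

AF = \frac{RR - 1}{RR}, \qquad AN = AF \times N,

where RR is the relative risk of admission at a given temperature relative to the reference (minimum-morbidity) temperature, AF is the attributable fraction among days at that temperature, and N is the number of admissions observed on those days.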

Relevance: 100.00%

Abstract:

It was widely anticipated that after the introduction of silicone hydrogel lenses, the risk of microbial keratitis would be lower than with hydrogel lenses because of the reduction in hypoxic effects on the corneal epithelium. Large-scale epidemiological studies have confirmed that the absolute and relative risk of microbial keratitis is unchanged with overnight use of silicone hydrogel materials. The key findings include the following: (1) The risk of infection with 30 nights of silicone hydrogel use is equivalent to that with 6 nights of hydrogel extended wear; (2) Occasional overnight lens use is associated with a greater risk than daily lens use; (3) The rate of vision loss due to corneal infection with silicone hydrogel contact lenses is similar to that seen with hydrogel lenses; (4) The spectrum of causative organisms is similar to that seen with hydrogel lenses, and the material type does not affect the corneal location of presumed microbial keratitis; and (5) Modifiable risk factors for infection include overnight lens use, the degree of exposure, failing to wash hands before lens handling, and storage case hygiene practice. The lack of change in the absolute risk of disease suggests that exposure to a large number of pathogenic organisms can overcome any advantages obtained from eliminating the hypoxic effects of contact lenses. Epidemiological studies remain important in the assessment of new materials and modalities. Consideration of an early adopter effect in studies involving new materials and modalities, and further investigation of the impact of second-generation silicone hydrogel materials, is warranted.

Relevance: 100.00%

Abstract:

Transport processes within heterogeneous media may exhibit non-classical diffusion or dispersion which is not adequately described by the classical theory of Brownian motion and Fick's law. We consider a space-fractional advection-dispersion equation based on a fractional Fick's law. Zhang et al. [Water Resources Research, 43(5) (2007)] considered such an equation with variable coefficients, which they discretised using the finite difference method proposed by Meerschaert and Tadjeran [Journal of Computational and Applied Mathematics, 172(1):65-77 (2004)]. For this method the presence of variable coefficients necessitates applying the product rule before discretising the Riemann–Liouville fractional derivatives using standard and shifted Grünwald formulas, depending on the fractional order. As an alternative, we propose using a finite volume method that deals directly with the equation in conservative form. Fractionally-shifted Grünwald formulas are used to discretise the Riemann–Liouville fractional derivatives at control volume faces, eliminating the need for product rule expansions. We compare the two methods for several case studies, highlighting the convenience of the finite volume approach.
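
As a small, self-contained illustration of the Grünwald weights used in both discretisations (the recurrence is the standard one; the function name is ours):

# Grünwald weights g_k = (-1)^k * binom(alpha, k), via the usual recurrence
# g_k = g_{k-1} * (k - 1 - alpha) / k, with g_0 = 1.
def grunwald_weights(alpha, n):
    """Return [g_0, ..., g_n] for fractional order alpha."""
    g = [1.0]
    for k in range(1, n + 1):
        g.append(g[-1] * (k - 1 - alpha) / k)
    return g

# Example: order 1.8. Here g_0 = 1, g_1 = -alpha, the remaining weights are
# positive for 1 < alpha < 2, and the full (infinite) sequence sums to zero.
w = grunwald_weights(1.8, 6)
print([round(x, 4) for x in w])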

Relevance: 100.00%

Abstract:

Port-Hamiltonian Systems (PHS) have a particular form that explicitly incorporates a function of the total energy in the system (the energy function) and also other functions that describe the structure of the system in terms of energy distribution. For PHS, the product of the input and output variables gives the rate of energy change. Systems of this type have the property that, under certain conditions on the energy function, the system is passive and thus stable. Therefore, if one can design a controller such that the closed-loop system retains - or takes - a PHS form, the closed-loop system will inherit the properties of passivity and stability. In this paper, the classical model of marine craft is put into a PHS form. It is shown that models used for positioning control do not have a PHS form due to a kinematic transformation, but that a control design can be done such that the closed-loop system takes a PHS form. It is further shown how integral action can be added and how the PHS form can be exploited to provide a procedure for control design that ensures passivity and thus stability.
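
For reference, one commonly used input-state-output port-Hamiltonian form and its energy balance, in LaTeX (standard notation, assumed here rather than quoted from the paper):

\dot{x} = \bigl(J(x) - R(x)\bigr)\,\frac{\partial H}{\partial x}(x) + g(x)\,u,
\qquad
y = g(x)^{\top}\frac{\partial H}{\partial x}(x),

\dot{H} = -\frac{\partial H}{\partial x}(x)^{\top} R(x)\,\frac{\partial H}{\partial x}(x) + u^{\top}y \;\le\; u^{\top}y,

where J(x) = -J(x)^{\top} encodes the interconnection structure, R(x) = R(x)^{\top} \succeq 0 the dissipation, and H the energy function. If H is bounded from below, the inequality is exactly the passivity property referred to above, with u^{\top}y the rate of energy change.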

Relevance: 100.00%

Abstract:

The cotton strip assay (CSA) is an established technique for measuring soil microbial activity. The technique involves burying cotton strips and measuring their tensile strength after a certain time. This gives a measure of the rotting rate, R, of the cotton strips; R is then a measure of soil microbial activity. This paper examines properties of the technique and indicates how the assay can be optimised. Humidity conditioning of the cotton strips before measuring their tensile strength reduced the within- and between-day variance and enabled the distribution of the tensile strength measurements to approximate normality. The test data came from a three-way factorial experiment (two soils, two temperatures, three moisture levels). The cotton strips were buried in the soil for intervals of time ranging up to 6 weeks. This enabled the rate of loss of cotton tensile strength with time to be studied under a range of conditions. An inverse cubic model accounted for greater than 90% of the total variation within each treatment combination. This offers support for summarising the decomposition process by a single parameter, R. The approximate variance of the decomposition rate was estimated from a function incorporating the variance of tensile strength and the derivative of the decomposition rate, R, with respect to tensile strength. This variance function has a minimum when the measured strength is approximately 2/3 that of the original strength. The estimates of R are almost unbiased and relatively robust against the cotton strips being left in the soil for more or less than the optimal time. We conclude that the rotting rate R should be measured using the inverse cubic equation, and that the cotton strips should be left in the soil until their strength has been reduced to about 2/3 of its original value.
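
The variance calculation described above has the form of a first-order (delta-method) approximation; written out in LaTeX (notation assumed, with S the measured tensile strength and R the fitted decomposition rate):

\operatorname{Var}(R) \;\approx\; \left(\frac{\mathrm{d}R}{\mathrm{d}S}\right)^{2} \operatorname{Var}(S),

so the precision of R depends jointly on the measurement variance of S and on how steeply R changes with S, which is what produces the minimum near two-thirds of the original strength.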

Relevance: 100.00%

Abstract:

Several recently proposed ciphers, for example Rijndael and Serpent, are built with layers of small S-boxes interconnected by linear key-dependent layers. Their security relies on the fact that the classical methods of cryptanalysis (e.g. linear or differential attacks) are based on probabilistic characteristics, which makes their security grow exponentially with the number of rounds N_r. In this paper we study the security of such ciphers under an additional hypothesis: the S-box can be described by an overdefined system of algebraic equations (true with probability 1). We show that this is true for both Serpent (due to the small size of its S-boxes) and Rijndael (due to unexpected algebraic properties). We study general methods known for solving overdefined systems of equations, such as XL from Eurocrypt '00, and show their inefficiency. Then we introduce a new method called XSL that uses the sparsity of the equations and their specific structure. The XSL attack uses only relations true with probability 1, and thus the security does not have to grow exponentially in the number of rounds. XSL has a parameter P, and from our estimations it seems that P should be a constant or grow very slowly with the number of rounds. The XSL attack would then be polynomial (or subexponential) in N_r, with a huge constant that is double-exponential in the size of the S-box. The exact complexity of such attacks is not known due to the redundant equations. Though the presented version of the XSL attack always costs more than exhaustive search for Rijndael, it seems to (marginally) break 256-bit Serpent. We suggest a new criterion for the design of S-boxes in block ciphers: they should not be describable by a system of polynomial equations that is too small or too overdefined.
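
As a toy illustration of the "overdefined system of quadratic equations" idea (a sketch only: the 3-bit S-box below is made up for the example and is not Serpent's or Rijndael's, and this is the equation-counting step, not the XSL attack itself):

# Count the multivariate quadratic equations over GF(2) that a small S-box
# satisfies with probability 1, by finding linear dependencies among monomials.
from itertools import combinations

SBOX = [7, 6, 0, 4, 2, 5, 1, 3]          # made-up 3-bit S-box, for illustration only
n = 3

def bits(v, width):
    return [(v >> i) & 1 for i in range(width)]

def monomials(vals):
    # Degree <= 2 monomials in the input/output bits: 1, each bit, each pairwise product.
    return [1] + list(vals) + [a & b for a, b in combinations(vals, 2)]

# One row per S-box input, one column per monomial, entries in GF(2).
rows = [monomials(bits(x, n) + bits(SBOX[x], n)) for x in range(2 ** n)]
n_rows, n_cols = len(rows), len(rows[0])

# Pack each column as an integer bitmask of its entries down the rows.
cols = [sum(rows[i][j] << i for i in range(n_rows)) for j in range(n_cols)]

# Null space over GF(2): every linear dependency among the columns is an equation
# in the S-box input/output bits that holds for all 2^n inputs.
basis = {}                                # pivot bit -> (reduced column, combination)
equations = []
for j, col in enumerate(cols):
    combo = 1 << j
    for pivot in sorted(basis):           # reduce pivots from the lowest bit upwards
        if col & pivot:
            bcol, bcombo = basis[pivot]
            col ^= bcol
            combo ^= bcombo
    if col == 0:
        equations.append(combo)           # a quadratic relation true with probability 1
    else:
        basis[col & -col] = (col, combo)  # lowest set bit becomes the new pivot

print(f"{len(equations)} quadratic equations hold for all inputs, "
      f"from {n_cols} monomials and only {n_rows} input/output pairs")

In this toy setting there are 22 monomials but only 8 input/output pairs, so at least 14 such equations must exist; that gap is the sense in which a small S-box yields an overdefined system.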

Relevance: 100.00%

Abstract:

We propose and evaluate a novel methodology to identify the rolling shutter parameters of a real camera. We also present a model for the geometric distortion introduced when a moving camera with a rolling shutter views a scene. Unlike previous work, this model allows for arbitrary camera motion, including accelerations; it is exact rather than a linearization; and it allows for arbitrary camera projection models, for example fisheye or panoramic. We show the significance of the errors introduced by a rolling shutter for typical robot vision problems such as structure from motion, visual odometry and pose estimation.
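
A minimal sketch of the core coupling a rolling-shutter model has to capture: each image row is exposed at a slightly different time, so the projection of a point depends on the camera pose at a row-dependent time. Everything below (pinhole projection, constant yaw rate, image size, line timing) is an illustrative assumption, not the paper's model.

import numpy as np

# Hypothetical constants: 480-row image, 30 fps with readout spread over the frame.
ROWS, F = 480, 500.0            # image rows, focal length in pixels
CX, CY = 320.0, 240.0           # principal point
T_LINE = 1.0 / (30 * ROWS)      # readout time per row (assumed)

def pose_at(t):
    """Camera pose at time t: here a constant yaw rate; any motion model could be used."""
    w = 0.5                     # rad/s yaw rate (assumed)
    c, s = np.cos(w * t), np.sin(w * t)
    R = np.array([[c, 0, s], [0, 1, 0], [-s, 0, c]])
    p = np.zeros(3)
    return R, p

def project(X, t):
    """Pinhole projection of world point X using the pose at time t."""
    R, p = pose_at(t)
    Xc = R.T @ (X - p)
    return F * Xc[0] / Xc[2] + CX, F * Xc[1] / Xc[2] + CY

def rolling_shutter_project(X, iters=10):
    """Fixed-point iteration: the capture time of a pixel depends on its row,
    and the row depends on the pose at that capture time."""
    t = 0.0
    for _ in range(iters):
        u, v = project(X, t)
        t = v * T_LINE          # row v is exposed v * T_LINE after the frame start
    return u, v

print(rolling_shutter_project(np.array([1.0, 0.2, 5.0])))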