893 results for Logic, Symbolic and mathematical.
Abstract:
Embedded systems are usually designed for a single task or a specified set of tasks. This specificity means that the system design, as well as its hardware/software development, can be highly optimized. Embedded software must meet requirements such as highly reliable operation on resource-constrained platforms, real-time constraints and rapid development. This necessitates the adoption of static machine-code analysis tools, running on a host machine, for the validation and optimization of embedded system code, which can help meet all of these goals. Such analysis can significantly improve software quality and is still a challenging field. This dissertation contributes an architecture-oriented code validation, error localization and optimization technique that assists the embedded system designer in software debugging, making it more effective at early detection of software bugs that are otherwise hard to detect, using static analysis of machine code. The focus of this work is to develop methods that automatically localize faults as well as optimize the code, and thus improve both the debugging process and the quality of the code. Validation is done with the help of rules of inference formulated for the target processor. The rules govern the occurrence of illegitimate or out-of-place instructions and code sequences for executing the computational and integrated peripheral functions. The stipulated rules are encoded in propositional logic formulae and their compliance is tested individually in all possible execution paths of the application programs.
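As an illustration of this path-wise rule checking, the sketch below encodes one such rule for PIC16F87X-style bank-select bits (RP0 in STATUS) and tests it on every path of a toy control flow graph; the CFG, rule and helper names are illustrative assumptions, not the dissertation's implementation.

```python
# A minimal sketch of path-wise rule checking, assuming PIC16F87X-style
# bank-select bits (RP0 in STATUS); the CFG, rule and helper names are
# illustrative, not the dissertation's tool.

# Toy CFG: node -> (straight-line block of instructions, successor nodes).
cfg = {
    "entry": (["BSF STATUS,RP0", "MOVWF TRISB"], ["then", "else"]),
    "then":  (["BCF STATUS,RP0", "MOVWF PORTB"], ["exit"]),
    "else":  (["MOVWF PORTB"], ["exit"]),   # missing switch back to bank 0
    "exit":  ([], []),
}

def paths(node, acc=()):
    """Enumerate every execution path (instruction sequence) from `node`."""
    block, succs = cfg[node]
    acc = acc + tuple(block)
    if not succs:
        yield acc
        return
    for s in succs:
        yield from paths(s, acc)

def bank0_before_portb(path):
    """One stipulated rule as a path predicate: any access to PORTB (a
    bank-0 register) must occur while RP0 was last cleared."""
    bank0 = True                      # reset state: RP0 = 0
    for ins in path:
        if ins == "BSF STATUS,RP0":
            bank0 = False
        elif ins == "BCF STATUS,RP0":
            bank0 = True
        elif "PORTB" in ins and not bank0:
            return False              # rule violated on this path
    return True

for p in paths("entry"):
    print("OK " if bank0_before_portb(p) else "BUG", " -> ".join(p))
```

Run on the toy graph, the path through "then" passes while the path through "else" is flagged, which is exactly the kind of out-of-place code sequence the rules are meant to catch.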
An incorrect sequence of machine-code patterns is identified using slicing techniques on the control flow graph generated from the machine code. An algorithm is proposed that assists the compiler in eliminating redundant bank-switching code and in deciding on the optimum data allocation to banked memory, resulting in the minimum number of bank-switching instructions in embedded system software. A relation matrix and a state transition diagram, formed for the active-memory-bank state transitions corresponding to each bank selection instruction, are used for the detection of redundant code. Instances of code redundancy based on the stipulated rules for the target processor are identified. This validation and optimization tool can be integrated into the system development environment. It is a novel approach, independent of compiler and assembler, and applicable to a wide range of processors once appropriate rules are formulated. Program states are identified mainly by machine-code pattern, which drastically reduces state-space creation and thereby improves on state-of-the-art model checking. Though the technique described is general, the implementation is architecture oriented, and hence the feasibility study is conducted on PIC16F87X microcontrollers. The proposed tool will be very useful in steering novices towards the correct use of difficult microcontroller features when developing embedded systems.
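A minimal sketch of the redundancy idea on straight-line code: track the active-bank state across instructions, and flag any bank-select instruction that leaves the state unchanged as removable. The function and mnemonics are hypothetical stand-ins, not the relation-matrix formulation of the thesis.

```python
# A minimal sketch of redundant bank-select detection, tracking the
# active-bank state (RP0 bit) across straight-line code; hypothetical
# names, not the thesis's relation-matrix method.

def redundant_bank_switches(instructions, initial_rp0=0):
    """Return indices of bank-select instructions that do not change
    the currently active bank and are therefore removable."""
    rp0 = initial_rp0
    redundant = []
    for i, ins in enumerate(instructions):
        if ins == "BSF STATUS,RP0":
            if rp0 == 1:
                redundant.append(i)   # already in bank 1
            rp0 = 1
        elif ins == "BCF STATUS,RP0":
            if rp0 == 0:
                redundant.append(i)   # already in bank 0
            rp0 = 0
    return redundant

code = ["BSF STATUS,RP0", "MOVWF TRISB",
        "BSF STATUS,RP0",             # redundant: bank 1 is already active
        "MOVWF TRISA",
        "BCF STATUS,RP0", "MOVWF PORTB"]
print(redundant_bank_switches(code))  # -> [2]
```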
Abstract:
Econometrics is a young science. It developed during the twentieth century, from the mid-1930s onward, primarily after World War II. Econometrics is the unification of statistical analysis, economic theory and mathematics. Its history can be traced to the use of statistical and mathematical analysis in economics. The most prominent contributions during the initial period can be seen in the works of Tinbergen and Frisch, and also in that of Haavelmo from the 1940s through the mid-1950s. From the rudimentary application of statistics to economic data, such as the use of the laws of error through the development of least squares by Legendre, Laplace and Gauss, the discipline later witnessed the applied work of Edgeworth and Mitchell. A very significant milestone in its evolution was the work of Tinbergen, Frisch and Haavelmo in their development of multiple regression and correlation analysis, techniques they used to test different economic theories with time series data. Even though some predictions based on econometric methodology might have gone wrong, the sound scientific nature of the discipline cannot be ignored; this is reflected in the economic rationale underlying any econometric model and in the statistical and mathematical reasoning behind the various inferences drawn. The relevance of econometrics as an academic discipline assumes high significance in this context. Because of the interdisciplinary nature of econometrics (a unification of economics, statistics and mathematics), the subject can be taught in all these broad areas, notwithstanding the fact that most often economics students alone are offered the subject, as those of other disciplines might not have an adequate economics background to understand it. In fact, econometrics is quite relevant even for technical courses (like engineering), business management courses (like the MBA) and professional accountancy courses, and more so for research students in the various social sciences, commerce and management. In the ongoing scenario of globalization and economic deregulation, there is a need to give added thrust to the academic discipline of econometrics in higher education, across the various social science streams, commerce, management, professional accountancy and so on. In this way the analytical ability of students can be sharpened and their capacity to examine socio-economic problems with a mathematical approach improved, enabling them to derive scientific inferences and solutions to such problems. The utmost significance of hands-on practical training in the use of computer-based econometric packages, especially at the postgraduate and research levels, needs to be pointed out here: mere learning of the econometric methodology or the underlying theories alone would not have much practical utility for students in their future careers, whether in academia, industry or practice. This paper seeks to trace the historical development of econometrics and to study its current status as an academic discipline in higher education. Besides, the paper looks into the problems faced by teachers in teaching econometrics and by students in learning the subject, including the effective application of the methodology in real-life situations. Accordingly, the paper offers some meaningful suggestions for the effective teaching of econometrics in higher education.
Abstract:
The main objective of this thesis is to design and develop spectral-signature-based chipless RFID tags. Multiresonators are an essential component of spectral-signature-based chipless tags. Enhancing the data coding capacity of such tags requires a large number of resonances in a limited bandwidth, so the resonant frequencies have to be close to each other; to achieve this, the quality factor of each resonance needs to be high. The thesis discusses various types of multiresonators, their practical implementation and how they can be used in tag design. Encoding data into the spectral domain is another challenge in chipless tag design. The technique used here is presence/absence encoding: the presence of a resonance encodes Logic 1 and the absence of a specific resonance encodes Logic 0. Different types of multiresonators, such as open stub multiresonators, coupled bunch hairpin resonators and a shorted slot ground ring resonator, are proposed in this thesis.
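A minimal sketch of the presence/absence decoding described above: each designed resonant frequency carries one bit, set to 1 if a dip is detected near it and 0 otherwise. The frequencies and tolerance below are illustrative assumptions, not values from the thesis.

```python
# A minimal sketch of presence/absence decoding for a spectral-signature
# chipless tag; frequencies and tolerance are illustrative assumptions.

design_freqs_ghz = [2.0, 2.4, 2.8, 3.2, 3.6, 4.0]   # one resonator per bit

def decode_tag(detected_dips_ghz, tol=0.05):
    """Map detected resonance dips to a bit string: presence -> 1, absence -> 0."""
    bits = []
    for f in design_freqs_ghz:
        present = any(abs(f - d) <= tol for d in detected_dips_ghz)
        bits.append("1" if present else "0")
    return "".join(bits)

# A tag fabricated with resonators at 2.0, 2.8 and 4.0 GHz only:
print(decode_tag([2.01, 2.79, 3.99]))   # -> "101001"
```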
Abstract:
Type and rate of fertilizers influence the level of soil organic carbon (Corg) and total nitrogen (Nt) markedly, but the effect on C and N partitioning into different pools is open to question. The objectives of the present work were to: (i) quantify the impact of fertilizer type and rate on labile, intermediate and passive C and N pools by using a combination of biological, chemical and mathematical methods; (ii) explain previously reported differences in the soil organic matter (SOM) levels between soils receiving farmyard manure with or without biodynamic preparations by using Corg time series and information on SOM partitioning; and (iii) quantify the long-term and short-term dynamics of SOM in density fractions and microbial biomass as affected by fertilizer type and rate and determine the incorporation of crop residues into labile SOM fractions. Samples were taken from a sandy Cambisol from the long-term fertilization trial in Darmstadt, Germany, founded in 1980. The nine treatments (four field replicates) were: straw incorporation plus application of mineral fertilizer (MSI) and application of rotted farmyard manure with (DYN) or without (FYM) addition of biodynamic preparations, each at high (140 – 150 kg N ha-1 year-1; MSIH, DYNH, FYMH), medium (100 kg N ha-1 year-1; MSIM, DYNM, FYMM) and low (50 – 60 kg N ha-1 year-1; MSIL, DYNL, FYML) rates. The main findings were: (i) The stocks of Corg (t ha-1) were affected by fertilizer type and rate and increased in the order MSIL (23.6), MSIM (23.7), MSIH (24.2) < FYML (25.3) < FYMM (28.1), FYMH (28.1). Stocks of Nt were affected in the same way (C/N ratio: 11). Storage of C and N in the modelled labile pools (turnover times: 462 and 153 days for C and N, respectively) was not influenced by the type of fertilizer (FYM and MSI) but depended significantly (p ≤ 0.05) on the application rate, ranging from 1.8 to 3.2 t C ha-1 (7 – 13% of Corg) and from 90 to 140 kg N ha-1 (4 – 5% of Nt). In the calculated intermediate pool (C/N ratio 7), stocks of C were markedly higher in the FYM treatments (15 – 18 t ha-1) than in the MSI treatments (12 – 14 t ha-1). This showed that differences in SOM stocks in the sandy Cambisol induced by fertilizer rate may be short-lived in the case of changing management, but differences induced by fertilizer type may persist for decades. (ii) Crop yields, estimated C inputs (1.5 t ha-1 year-1) with crop residue, microbial biomass C (Cmic, 118 – 150 mg kg-1), microbial biomass N (17 – 20 mg kg-1) and labile C and N pools did not differ significantly between the FYM and DYN treatments. However, labile C increased linearly with application rate (R2 = 0.53) from 7 to 11% of Corg; the same applied to labile N (3.5 to 4.9% of Nt). The higher contents of Corg in the DYN treatments had existed since 1982, when the first sampling was conducted for all individual treatments, and contents of Corg in the DYN and FYM treatments have converged slightly since then. Furthermore, at least 30% of the difference in Corg was located in the passive pool, where a treatment effect could be excluded. Therefore, the reported differences in Corg contents most likely existed since the beginning of the experiment and, as a single factor of biodynamic agriculture, the application of biodynamic preparations had no effect on SOM stocks. (iii) Stocks of SOM, light fraction organic C (LFOC, ρ ≤ 2.0 g cm-3), light fraction organic N and Cmic decreased in the order FYMH > FYML > MSIH, MSIL for all sampling dates in 2008 (March, May, September, December).
However, the statistical significance of treatment effects differed between the dates, probably due to differences in spatial variation throughout the year. The high proportion of LFOC in total Corg stocks (45 – 55%) highlighted the importance of selective preservation of OM as a stabilization mechanism in this sandy Cambisol. The apparent turnover time of LFOC was between 21 and 32 years, which agreed very well with studies covering substantially longer periods after vegetation change than ours. Overall, both approaches, (i) the combination of incubation, chemical fractionation and simple modelling and (ii) density fractionation, provided complementary information on the partitioning of SOM into pools of different stability. The density fractionation showed that differences in Corg stocks between the FYM and MSI treatments were mainly located in the light fraction, i.e. induced by the higher recalcitrance of the organic input in the FYM treatments. Moreover, the combination of biological, chemical and mathematical methods indicated that effects of fertilizer rate on total Corg and Nt stocks may be short-lived, but that the effect of fertilizer type may persist for longer time spans in the sandy Cambisol.
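A minimal sketch of the first-order pool dynamics underlying such SOM partitioning models: each pool decays at a rate k equal to the inverse of its turnover time and receives a fixed share of the annual C input. The 462-day labile turnover time and the 1.5 t C ha-1 yr-1 input are from the abstract; the initial pool sizes sit inside the reported ranges, while the intermediate turnover time and the input split are assumptions.

```python
# First-order SOM pool dynamics: dC/dt = input_rate - k*C.
# Only the 462-day labile turnover and 1.5 t C ha-1 yr-1 input are from
# the abstract; other parameters are illustrative assumptions.

import math

def pool_size(c0, input_rate, k, years):
    """Analytical solution of dC/dt = input_rate - k*C after `years` years."""
    return input_rate / k + (c0 - input_rate / k) * math.exp(-k * years)

k_labile = 365.0 / 462.0            # per year, from the 462-day turnover time
k_inter = 1.0 / 50.0                # per year, assumed 50-year turnover

for name, c0, share, k in [("labile", 2.5, 0.6, k_labile),
                           ("intermediate", 15.0, 0.4, k_inter)]:
    c20 = pool_size(c0, input_rate=1.5 * share, k=k, years=20)
    print(f"{name}: {c20:.2f} t C ha-1 after 20 years")
```

The fast pool equilibrates within a few years while the slow pool drifts for decades, which is the mechanism behind the finding that rate-induced differences are short-lived but type-induced differences persist.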
Abstract:
Through a case study, we explore how the sensemaking of a group of executives, acting under a shared inspiration, triggered the beginning of a strategic change at a prestigious and renowned Colombian university, the Universidad del Rosario: an institution that at a certain point noticed it was being perceived within the higher education sector as small, static in the advancement of some disciplines of knowledge, and conservative; in other words, that it was losing the recognition that had usually accompanied it. The case study used discourse analysis to understand the sensemaking at the start of a strategic change in organizations. This technique made it possible to analyse the qualitative information derived from in-depth interviews with the institution's top executives and with some prominent representatives of the higher education sector in Colombia. The results suggest that certain specific conditions were indeed present that marked the beginning of a strategic change in the institution and a shift in its identity and image, grounded in the members of a team that sought to interpret and understand the changes in the global and local environment and, likewise, to assimilate some of the notable challenges being posed at the time within the University itself.
Abstract:
This paper analyzes the measure of systemic importance ∆CoVaR proposed by Adrian and Brunnermeier (2009, 2010) within the context of a similar class of risk measures used in the risk management literature. In addition, we develop a series of testing procedures, based on ∆CoVaR, to identify and rank the systemically important institutions. We stress the importance of statistical testing in interpreting the measure of systemic importance. An empirical application illustrates the testing procedures, using equity data for three European banks.
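For reference, a sketch of the standard definition from Adrian and Brunnermeier (notation may differ in detail from the paper): CoVaR is the value-at-risk of the system conditional on an institution being at its own VaR, and ∆CoVaR is the shift in that conditional VaR between the institution's distressed and median states.

```latex
% Sketch of the standard CoVaR definitions (Adrian & Brunnermeier);
% notation may differ in detail from the paper under review.
\Pr\left( X^{\mathrm{sys}} \le \mathrm{CoVaR}_q^{\,\mathrm{sys}\mid i}
          \;\middle|\; X^{i} = \mathrm{VaR}_q^{i} \right) = q ,
\qquad
\Delta\mathrm{CoVaR}_q^{\,\mathrm{sys}\mid i}
  = \mathrm{CoVaR}_q^{\,\mathrm{sys}\mid X^{i}=\mathrm{VaR}_q^{i}}
  - \mathrm{CoVaR}_q^{\,\mathrm{sys}\mid X^{i}=\mathrm{VaR}_{0.5}^{i}} .
```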
Abstract:
From modernity to the contemporary world, museums have been acknowledged for their power to produce metamorphoses of meanings and functions, for their ability to adapt to historical and social determinations, and for their calling for cultural mediation. They derive from creative gestures which bind the symbolic and the material, the sensible and the intelligible. For this very reason the bridge metaphor fits them well: a bridge cast between different times, spaces, individuals, social groups and cultures, a bridge built with images that holds a special place in the imaginary.
Abstract:
Most research on the discourses and practices of urban regeneration in the UK has examined case studies located in areas of relative socio-economic distress. Less research has been undertaken on regeneration projects and agendas in areas characterised by strong economic growth. Yet it is in such places that some of the best examples of the discourses, practices and impacts of contemporary urban regeneration can be found. In some areas of high demand, regeneration projects have used inner urban brownfield sites as locations for new investment. With the New Labour government's urban policy agendas targeting similar forms of regeneration, an examination of completed or on-going schemes is timely and relevant to debates over the direction that policy should take. This paper, drawing on a study of urban regeneration in one of England's fastest growing towns, Reading in Berkshire, examines the discourses, practices and impacts of redevelopment schemes during the 1990s and 2000s. Reading's experiences have received national attention and have been hailed as a model for other urban areas to follow. The research documents the discursive and concrete aspects of local regeneration and examines the ways in which specific priorities and defined problems have come to dominate agendas. Collectively, the study argues that market-driven objectives come to dominate regeneration agendas, even in areas of strong demand where development agencies wield a relatively high degree of influence. Such regeneration plays a symbolic and practical role in creating new forms of exclusion and new interpretations of place.
Abstract:
Current feed evaluation systems for dairy cattle aim to match nutrient requirements with nutrient intake at pre-defined production levels. These systems were not developed to address, and are not suitable to predict, the responses to dietary changes in terms of production level and product composition, excretion of nutrients to the environment, and nutrition related disorders. The change from a requirement to a response system to meet the needs of various stakeholders requires prediction of the profile of absorbed nutrients and its subsequent utilisation for various purposes. This contribution examines the challenges to predicting the profile of nutrients available for absorption in dairy cattle and provides guidelines for further improved prediction with regard to animal production responses and environmental pollution. The profile of nutrients available for absorption comprises volatile fatty acids, long-chain fatty acids, amino acids and glucose. Thus the importance of processes in the reticulo-rumen is obvious. Much research into rumen fermentation is aimed at determination of substrate degradation rates. Quantitative knowledge on rates of passage of nutrients out of the rumen is rather limited compared with that on degradation rates, and thus should be an important theme in future research. Current systems largely ignore microbial metabolic variation, and extant mechanistic models of rumen fermentation give only limited attention to explicit representation of microbial metabolic activity. Recent molecular techniques indicate that knowledge on the presence and activity of various microbial species is far from complete. Such techniques may give a wealth of information, but to include such findings in systems predicting the nutrient profile requires close collaboration between molecular scientists and mathematical modellers on interpreting and evaluating quantitative data. Protozoal metabolism is of particular interest here given the paucity of quantitative data. Empirical models lack the biological basis necessary to evaluate mitigation strategies to reduce excretion of waste, including nitrogen, phosphorus and methane. Such models may have little predictive value when comparing various feeding strategies. Examples include the Intergovernmental Panel on Climate Change (IPCC) Tier II models to quantify methane emissions and current protein evaluation systems to evaluate low protein diets to reduce nitrogen losses to the environment. Nutrient based mechanistic models can address such issues. Since environmental issues generally attract more funding from governmental offices, further development of nutrient based models may well take place within an environmental framework.
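As a concrete illustration of why passage rates deserve as much attention as degradation rates, the sketch below uses the standard competing first-order-rates result: when degradation (kd) and passage (kp) compete for the same degradable pool, the fraction actually fermented in the rumen is kd / (kd + kp). The rate values are illustrative assumptions, not data from this contribution.

```python
# Competing first-order rates in the rumen: the fraction of a feed's
# degradable pool fermented before it passes out is kd / (kd + kp).
# Rates below are illustrative assumptions.

def rumen_degraded_fraction(kd, kp):
    """Fraction degraded when degradation (kd) and passage (kp) compete,
    both expressed as first-order rates (per hour)."""
    return kd / (kd + kp)

for kp in (0.02, 0.05, 0.08):        # slow to fast passage
    frac = rumen_degraded_fraction(kd=0.06, kp=kp)
    print(f"kp={kp:.2f}/h -> degraded in rumen: {frac:.0%}")
```

Doubling the assumed passage rate cuts the rumen-degraded fraction substantially, which is why uncertainty in kp propagates directly into the predicted profile of absorbed nutrients.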
Abstract:
Similarities between the anatomies of living organisms are often used to draw conclusions regarding the ecology and behaviour of extinct animals. Several pterosaur taxa are postulated to have been skim-feeders based largely on supposed convergences of their jaw anatomy with that of the modern skimming bird, Rynchops spp. Using physical and mathematical models of Rynchops bills and pterosaur jaws, we show that skimming is considerably more energetically costly than previously thought for Rynchops and that pterosaurs weighing more than one kilogram would not have been able to skim at all. Furthermore, anatomical comparisons between the highly specialised skull of Rynchops and those of postulated skimming pterosaurs suggest that even smaller forms were poorly adapted for skim-feeding. Our results refute the hypothesis that some pterosaurs commonly used skimming as a foraging method and illustrate the pitfalls involved in extrapolating from limited morphological convergence.
Abstract:
We introduce the perspex machine which unifies projective geometry and Turing computation and results in a supra-Turing machine. We show two ways in which the perspex machine unifies symbolic and non-symbolic AI. Firstly, we describe concrete geometrical models that map perspexes onto neural networks, some of which perform only symbolic operations. Secondly, we describe an abstract continuum of perspex logics that includes both symbolic logics and a new class of continuous logics. We argue that an axiom in symbolic logic can be the conclusion of a perspex theorem. That is, the atoms of symbolic logic can be the conclusions of sub-atomic theorems. We argue that perspex space can be mapped onto the spacetime of the universe we inhabit. This allows us to discuss how a robot might be conscious, feel, and have free will in a deterministic, or semi-deterministic, universe. We ground the reality of our universe in existence. On a theistic point, we argue that preordination and free will are compatible. On a theological point, we argue that it is not heretical for us to give robots free will. Finally, we give a pragmatic warning as to the double-edged risks of creating robots that do, or alternatively do not, have free will.
Abstract:
In this paper we explore the importance of emotionally inter-dependent relationships to the functioning of embodied social capital and habitus. Drawing upon the experiences of young people with socio-emotional differences, we demonstrate how emotionally inter-dependent and relatively nurturing relationships are integral to the acquisition of social capital and to the co-construction and embodiment of habitus. The young people presented in this paper often had difficulties in forging social relationships and in acquiring symbolic and cultural capital in school spaces. However, we outline how these young people (re)produce and embody alternative kinds of habitus, based on emotionally reciprocal relationships forged through formal and informal leisure activities and familial and fraternal social relationships. These alternative forms of habitus provide sites of subjection, scope for acquiring social and cultural capital and a positive sense of identity in the face of problematic relations and experiences in school spaces.
Abstract:
Iso-score curve graphs (iSCGs) and mathematical relationships between Scoring Parameters (SPs) and Forecasting Parameters (FPs) can be used in the Economic Scoring Formulas (ESFs) employed in tendering to distribute the score among bidders in the economic part of a proposal. Each contracting authority must set an ESF when publishing tender specifications, and the strategy of each bidder will differ depending on the ESF selected and its weight in the overall proposal scoring. The various mathematical relationships and density distributions that describe the main SPs and FPs, together with the representation of tendering data by means of iSCGs, enable the generation of two new types of graphs that can be very useful for bidders who want to be more competitive: the scoring and position probability graphs.
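A minimal sketch of one common ESF, proportional scoring against the lowest bid; the paper studies families of such formulas and their iso-score curves, and this particular form is an illustrative assumption, not necessarily one analysed there.

```python
# One common Economic Scoring Formula: score proportional to the lowest
# bid. An illustrative assumption, not necessarily the paper's ESF.

def proportional_esf(bids, max_score=50.0):
    """Score each bid as max_score * lowest_bid / bid: the cheapest bid
    earns the full economic score, costlier bids proportionally less."""
    low = min(bids.values())
    return {bidder: max_score * low / price for bidder, price in bids.items()}

bids = {"A": 95_000.0, "B": 100_000.0, "C": 120_000.0}
for bidder, score in proportional_esf(bids).items():
    print(bidder, round(score, 2))   # A 50.0, B 47.5, C 39.58
```

Under such a formula, a bidder's optimal price depends on the anticipated distribution of competing bids, which is exactly the information the proposed scoring and position probability graphs are meant to summarise.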