Abstract:
Accurately calibrated effective field theories are used to compute atomic parity nonconserving (APNC) observables. Although accurately calibrated, these effective field theories predict a large spread in the neutron skin of heavy nuclei. While the neutron skin is strongly correlated with numerous physical observables, in this contribution we focus on its impact on new physics through APNC observables. The addition of an isoscalar-isovector coupling constant to the effective Lagrangian generates a wide range of values for the neutron skin of heavy nuclei without compromising the success of the model in reproducing well-constrained nuclear observables. Earlier studies have suggested that the use of isotopic ratios of APNC observables may eliminate their sensitivity to atomic structure. This leaves nuclear-structure uncertainties as the main impediment to identifying physics beyond the standard model. We establish that uncertainties in the neutron skin of heavy nuclei are at present too large to measure isotopic ratios to better than the 0.1% accuracy required to test the standard model. However, we argue that such uncertainties will be significantly reduced by the upcoming measurement of the neutron radius of ²⁰⁸Pb at the Jefferson Laboratory.
Abstract:
Data caching can remarkably improve the efficiency of information access in a wireless ad hoc network by reducing access latency and bandwidth usage. The cache placement problem seeks to minimize the total data access cost in ad hoc networks with multiple data items. Ad hoc networks are multi-hop networks without a central base station and are resource-constrained in terms of channel bandwidth and battery power. Data caching can reduce communication cost in terms of both bandwidth and battery energy. Since network nodes have limited memory, cache placement is a vital issue. This paper studies the existing cooperative caching techniques and their suitability for mobile ad hoc networks.
Abstract:
Multiwall carbon nanotubes (MWCNTs) possessing an average inner diameter of 150 nm were synthesized by template-assisted chemical vapor deposition over an alumina template. An aqueous ferrofluid based on superparamagnetic iron oxide nanoparticles (SPIONs) was prepared by a controlled co-precipitation technique, and this ferrofluid was used to fill the MWCNTs by nanocapillarity. The filling of the nanotubes with iron oxide nanoparticles was confirmed by electron microscopy. Selected-area electron diffraction indicated the presence of iron oxide and of graphitic carbon from the MWCNTs. The magnetic phase transition during cooling of the MWCNT–SPION composite was investigated by low-temperature magnetization studies and zero-field-cooled (ZFC) and field-cooled experiments. The ZFC curve exhibited blocking at ∼110 K. A peculiar ferromagnetic ordering exhibited by the MWCNT–SPION composite above room temperature is attributed to the ferromagnetic interaction emanating from the clustering of superparamagnetic particles in the constrained volume of an MWCNT. Such MWCNT–SPION composites can be envisaged as good agents for various biomedical applications.
Abstract:
The Unit Commitment Problem (UCP) in power systems refers to the problem of determining the on/off status of generating units so as to minimize the operating cost over a given time horizon. Since various system and generation constraints must be satisfied while finding the optimum schedule, the UCP is a constrained optimization problem in power system scheduling. Numerical solutions developed so far are limited to small systems, and heuristic methodologies have difficulty handling the stochastic cost functions associated with practical systems. This paper models unit commitment as a multi-stage decision-making task, and an efficient Reinforcement Learning solution is formulated considering minimum up-time/down-time constraints. The correctness and efficiency of the developed solutions are verified on standard test systems.
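The abstract frames unit commitment as a multi-stage decision task solved with reinforcement learning. As a hedged illustration of that framing (not the paper's actual algorithm), the sketch below applies tabular Q-learning to an invented two-unit, four-hour instance; all numbers (demands, capacities, costs) are made up, and the minimum up-/down-time constraints the paper handles are omitted for brevity.

```python
import itertools
import random

random.seed(0)

# Toy unit-commitment instance (hypothetical numbers, not from the paper):
# two generating units and a four-hour horizon with an hourly demand.
DEMAND = [100, 150, 150, 100]            # MW per hour
CAPACITY = [120, 80]                     # maximum output of each unit
RUN_COST = [30, 50]                      # $/MWh while committed (at full output)
STARTUP_COST = [200, 100]                # $ each time a unit turns on

ACTIONS = list(itertools.product([0, 1], repeat=2))   # on/off status pairs

def stage_cost(prev, status, hour):
    """Cost of one commitment decision; a large penalty enforces demand."""
    cap = sum(c for c, s in zip(CAPACITY, status) if s)
    cost = sum(RUN_COST[i] * CAPACITY[i] for i, s in enumerate(status) if s)
    cost += sum(STARTUP_COST[i] for i, (p, s) in enumerate(zip(prev, status))
                if s and not p)
    if cap < DEMAND[hour]:
        cost += 10_000                   # infeasible: demand not met
    return cost

def q_learning(episodes=20_000, alpha=0.1, gamma=1.0, eps=0.2):
    """Tabular Q-learning over states (hour, previous on/off status)."""
    Q = {}
    for _ in range(episodes):
        prev = (1, 1)                    # all units on at the start
        for hour in range(len(DEMAND)):
            key = (hour, prev)
            Q.setdefault(key, {a: 0.0 for a in ACTIONS})
            a = (random.choice(ACTIONS) if random.random() < eps
                 else min(Q[key], key=Q[key].get))
            cost = stage_cost(prev, a, hour)
            nxt = (hour + 1, a)
            future = min(Q[nxt].values()) if nxt in Q else 0.0
            Q[key][a] += alpha * (cost + gamma * future - Q[key][a])
            prev = a
    # Extract the greedy (lowest estimated cost-to-go) schedule.
    prev, schedule = (1, 1), []
    for hour in range(len(DEMAND)):
        values = Q.get((hour, prev), {a: 0.0 for a in ACTIONS})
        a = min(values, key=values.get)
        schedule.append(a)
        prev = a
    return schedule

schedule = q_learning()
print(schedule)
```

Because the state (hour, previous status) captures start-up coupling between hours, the learned schedule meets demand in every hour without an explicit optimization model.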
Abstract:
This study reports the details of the finite element analysis of eleven shear-critical partially prestressed concrete T-beams having steel fibers over partial or full depth. Prestressed T-beams with shear span-to-depth ratios of 2.65 and 1.59 that failed in shear have been analyzed using the 'ANSYS' program. The 'ANSYS' model accounts for nonlinearities such as bond-slip of longitudinal reinforcement, post-cracking tensile stiffness of the concrete, stress transfer across the cracked blocks of the concrete, and load sustenance through the bridging action of steel fibers at the crack interface. The concrete is modeled using 'SOLID65', an eight-node brick element capable of simulating the cracking and crushing behavior of brittle materials. The reinforcement, including deformed bars, prestressing wires and steel fibers, has been modeled discretely using 'LINK8', a 3D spar element. The slip between the reinforcement (rebars, fibers) and the concrete has been modeled using 'COMBIN39', a nonlinear spring element connecting the nodes of the 'LINK8' elements representing the reinforcement and the nodes of the 'SOLID65' elements representing the concrete. The 'ANSYS' model correctly predicted the diagonal tension failure and shear compression failure of prestressed concrete beams observed in the experiment. The capability of the model to capture the critical crack regions, loads and deflections for various types of shear failures in prestressed concrete beams has been illustrated.
Abstract:
This dissertation contributes to the study of the construction of gender relations in marriage before the Reformation. Two fifteenth-century texts are examined: the so-called Erste deutsche Bibel (EDB) and the poem Der Ackermann aus Böhmen by Johannes von Tepl. A word-by-word analysis of both texts has extracted specific figures of thought from the period-bound German language. Central is the framing of the relationship between God and humans as a feudal legal relationship (Lehen). The first chapter analyzes and interprets the statements in Genesis 1-4:1 and the canon of the five wisdom books of the EDB that are fundamental to its concepts of marriage and gender, with the following results: 1. The EDB is to be read as a book of law whose center is divine rule with its legal order. Within this framework, the order 'couple' is defined for man and woman. 2. The two sexes are of equal worth, which is expressed in their respective fields of action (orders of work). 3. The term hilffen (helper) applied to the wife means help toward the salvation of her husband and a specific closeness to God. This is a decisive difference from Luther's Bible translation, which casts the woman as the subordinate assistant of the man. (Appendix III juxtaposes the EDB verses with the Luther translation of 1545.) 4. The Fall is described as a breach of law that triggers the transition from the order of creation in Paradise to the order of the world, which begins with God's judgments on woman and man (Genesis 3). These judgments place marriage within a diagonal cross that inseparably binds four factors: rule with subordination, and hilffen (childbearing) with death. The differences between the sexes are constitutive of marriage within the diagonal cross.
Three couple constellations can be distinguished in the EDB: marriage within the diagonal cross between the good weib (woman) and the wise man, marriage without the diagonal cross between the un-weib and the unwise man, and the non-marital gender relationship between the gemeine weib (common woman) and the ee-brecher (adulterer). The second chapter of the dissertation compares the Old Testament marriage concepts of the EDB with those of the poem Ackermann aus Böhmen. The rhetorical form of the dispute between the widower and hern Tot (Lord Death) follows the structure of a German-law trial. Within this framework, the opposing statements about marriage take on the significance of legal positions located in contrary divine orders of rule. The order of rule represented by the widower rests on the Old Testament legal order of the EDB, while the positions of hern Tot invert that legal order by defining the order of rule of the world as his own. A further result of the comparison between the EDB and the Ackermann aus Böhmen is that both the Old Testament books and the poem present legal figures. Decisive in both texts are God's judgments, each of which constitutes a paradigm shift. In the EDB, after the Fall the (paradisiacal) legal order is transformed into the legal order ee (marriage). In the Ackermann, the Old Testament legal order ee of the EDB is replaced by the ordenung Tod (order of Death), with which the order 'couple' also ceases to apply. God's judgments in the EDB define the couple as a two-sexed human being; the judgment in the Ackermann characterizes two individual persons: man or woman. This abolishes the central significance of the wife as hilffen toward the salvation of her husband, because the mutual dependence of man and woman no longer obtains. In this respect an important step toward the Reformation understanding of marriage can be recognized here.
Abstract:
Cubicles should provide good resting comfort as well as clean udders. Dairy cows in cubicle houses often face a restrictive environment with regard to resting behaviour, while cleanliness may still be impaired. This study aimed to determine reliable behavioural measures of resting comfort applicable in on-farm welfare assessments. Furthermore, relationships between cubicle design, cow sizes, management factors and udder cleanliness (namely of teats and teat tips) were investigated. Altogether, 15 resting measures were examined in terms of feasibility, inter-observer reliability (IOR) and consistency of results per farm over time. They were recorded during three farm visits on farms in Germany and Austria with cubicle, deep litter and tie stall systems. Seven measures occurred too infrequently to allow reliable recording within a limited observation time. IOR was generally acceptable to excellent, except for 'collisions during lying down', which only showed good IOR after improvement of the definition. Only three measures were acceptably repeatable over time: 'duration of lying down', 'percentage of collisions during lying down' and 'percentage of cows lying partly or completely outside the lying area'. These measures were evaluated as suitable animal-based welfare measures of resting behaviour in the framework of an on-farm welfare assessment protocol. The second part of the thesis comprises a cross-sectional study on resting comfort and cow cleanliness including 23 Holstein Friesian dairy herds with very low within-farm variation in cubicle measures. Height at withers, shoulder width and diagonal body length were measured in 79-100 % of the cows (herd sizes 30 to 115 cows). Based on the 25 % largest animals, compliance with recommendations for cubicle measures was calculated. Cleanliness of different body parts, the udder, teats and teat tips was assessed for each cow in the herd prior to morning milking.
No significant correlation was found between udder soiling and teat or teat tip soiling at herd level. The final model of a stepwise regression on the percentage of dirty teats per farm explained 58.5 % of the variance and contained four factors: teat dipping after milking (which might be associated with an overall clean and accurate management style), deep-bedded cubicles, increasing cubicle maintenance times and decreasing compliance concerning total cubicle length all predicted lower teat soiling. The final model for teat tip soiling explained 46.0 % of the variance and contained three factors: increasing litter height in the rear part of the cubicle and increased alley soiling (which is difficult to explain) predicted less soiled teat tips, whereas increasing compliance concerning resting length was associated with higher percentages of dirty teat tips. The dependent variable 'duration of lying down' was likewise analysed using stepwise regression. The final model explained 54.8 % of the total variance. Lying-down duration was significantly shorter in deep-bedded cubicles. Further explanatory, though not significant, factors in the model were neck-rail height, deep bedding or comfort mattresses versus concrete floor or rubber mats, and clearance height of side partitions. In an attempt to create a more comprehensive lying-down measure, another analysis was carried out with the percentage of 'impaired lying down' (i.e. events exceeding 6.3 seconds, with collisions, or being interrupted) as the dependent variable. The explanatory value of this final model was 41.3 %. An increase in partition length, greater compliance concerning cubicle width and the presence of straw within the bedding predicted a lower proportion of impaired lying down. The effect of partition length is difficult to interpret, but partition length and height were positively correlated on the study farms, possibly leading to a bigger zone of clear space for pelvis freedom.
No associations could be found between impaired lying down and teat or teat tip soiling. Altogether, in agreement with earlier studies, it was found that cubicle dimensions in practice are often inadequate with regard to the body dimensions of the cows, leading to high proportions of impaired lying-down behaviour, while teat cleanliness remains unsatisfactory. Connections between cleanliness and cow comfort are far from simple. The relationship between cubicle characteristics and lying-down behaviour in particular is apparently very complex, so that it is difficult to identify single influential factors that are valid for all farm situations. However, based on the results of the present study, the use of deep-bedded cubicles can be recommended, as well as improved management with special regard to cubicle and litter maintenance, in order to achieve both better resting comfort and teat cleanliness.
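The analyses above rely on stepwise regression to explain variance in soiling and lying-down measures. As a rough sketch of how forward stepwise selection by adjusted R² works (the study's exact procedure and data are not reproduced here), consider the following; the predictor names are merely borrowed for flavor and the data are synthetic.

```python
import numpy as np

def adjusted_r2(y, X):
    """Adjusted R^2 of an ordinary-least-squares fit with intercept."""
    A = np.column_stack([np.ones(len(y)), X])
    beta, *_ = np.linalg.lstsq(A, y, rcond=None)
    resid = y - A @ beta
    r2 = 1 - (resid @ resid) / ((y - y.mean()) ** 2).sum()
    n, k = len(y), X.shape[1]
    return 1 - (1 - r2) * (n - 1) / (n - k - 1)

def forward_stepwise(y, X, names):
    """Greedily add the predictor that most improves adjusted R^2."""
    chosen, best = [], -np.inf
    while True:
        candidates = [j for j in range(X.shape[1]) if j not in chosen]
        if not candidates:
            break
        scores = {j: adjusted_r2(y, X[:, chosen + [j]]) for j in candidates}
        j = max(scores, key=scores.get)
        if scores[j] <= best:            # no further improvement: stop
            break
        chosen.append(j)
        best = scores[j]
    return [names[j] for j in chosen], best

# Synthetic illustration (invented data, not the study's herd data):
# y truly depends on the 1st and 3rd predictors only.
rng = np.random.default_rng(0)
X = rng.normal(size=(60, 4))
y = 2.0 * X[:, 0] - 1.5 * X[:, 2] + rng.normal(scale=0.5, size=60)
names = ["litter_height", "cubicle_length", "maintenance_time", "alley_soiling"]
selected, score = forward_stepwise(y, X, names)
print(selected, round(score, 3))
```

The stopping rule (stop when adjusted R² no longer improves) is one common choice; F-tests or p-value thresholds are equally usual in practice.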
Abstract:
In Mark Weiser's vision of ubiquitous computing, computers disappear from the focus of the users and seamlessly interact with other computers and users in order to provide information and services. This shift away from direct computer interaction requires new ways for applications to interact without bothering the user. Context is the information which can be used to characterize the situation of persons, locations, or other objects relevant to the applications. Context-aware applications are capable of monitoring and exploiting knowledge about external operating conditions. These applications can adapt their behaviour based on the retrieved information and thus replace (at least to a certain extent) the missing user interactions. Context awareness can be assumed to be an important ingredient for applications in ubiquitous computing environments. However, context management in ubiquitous computing environments must reflect the specific characteristics of these environments, for example distribution, mobility, resource-constrained devices, and heterogeneity of context sources. Modern mobile devices are equipped with fast processors, sufficient memory, and several sensors, such as a Global Positioning System (GPS) sensor, a light sensor, or an accelerometer. Since many applications in ubiquitous computing environments can exploit context information for enhancing their service to the user, these devices are highly useful for context-aware applications in ubiquitous computing environments. Additionally, context reasoners and external context providers can be incorporated. It is possible that several context sensors, reasoners and context providers offer the same type of information. However, these information providers can differ in the quality levels (e.g. accuracy), representations (e.g. a position represented as coordinates or as an address), and costs (such as battery consumption) of the offered information.
In order to simplify the development of context-aware applications, developers should be able to transparently access context information without bothering with the underlying context-accessing techniques and distribution aspects. They should rather be able to express which kind of information they require, which quality criteria this information should fulfil, and how much the provision of this information may cost (not only monetary cost but also energy or performance usage). For this purpose, application developers as well as developers of context providers need a common language and vocabulary to specify which information they require or provide, respectively. These descriptions and criteria have to be matched. For a matching of these descriptions, a transformation of the provided information is likely needed to fulfil the criteria of the context-aware application. As it is possible that more than one provider fulfils the criteria, a selection process is required. In this process the system has to trade off the provided quality of context and the costs of the context provider against the quality of context requested by the context consumer. This selection makes it possible to turn on context sources only if they are required. Explicitly selecting context services, and thereby dynamically activating and deactivating local context providers, has the advantage that resource consumption is reduced, as unused context sensors in particular are deactivated. One promising solution is a middleware providing appropriate support based on the principles of service-oriented computing, such as loose coupling, abstraction, reusability, and discoverability of context providers. This allows us to abstract context sensors, context reasoners and also external context providers as context services.
In this thesis we present our solution, consisting of a context model and ontology, a context offer and query language, a comprehensive matching and mediation process, and a selection service. Especially the matching and mediation process and the selection service differ from existing works. The matching and mediation process allows an autonomous establishment of mediation processes in order to transfer information from an offered representation into a requested representation. In contrast to other approaches, the selection service selects not just one service for one service request, but rather a set of services in order to fulfil all requests, which also facilitates the sharing of services. The approach is extensively reviewed against the different requirements, and a set of demonstrators shows its usability in real-world scenarios.
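The offer/query matching and cost-aware selection idea described above can be sketched minimally as follows; the attribute names (accuracy, cost) and the greedy prefer-already-active strategy are simplifying assumptions, not the thesis's actual language or selection algorithm.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Offer:                 # a context provider's advertisement
    name: str
    info_type: str
    accuracy: float          # one quality-of-context dimension
    cost: float              # e.g. battery or monetary cost

@dataclass(frozen=True)
class Query:                 # a context consumer's request
    info_type: str
    min_accuracy: float
    max_cost: float

def matches(offer, query):
    """An offer matches a query if type, quality and cost criteria fit."""
    return (offer.info_type == query.info_type
            and offer.accuracy >= query.min_accuracy
            and offer.cost <= query.max_cost)

def select_providers(offers, queries):
    """Pick one matching offer per query, reusing already-activated
    providers where possible so that unused sensors can stay off."""
    active, assignment = set(), {}
    for q in queries:
        candidates = [o for o in offers if matches(o, q)]
        if not candidates:
            assignment[q] = None          # unsatisfiable request
            continue
        # Prefer an already-active provider (sharing); otherwise cheapest.
        shared = [o for o in candidates if o.name in active]
        best = min(shared or candidates, key=lambda o: o.cost)
        active.add(best.name)
        assignment[q] = best
    return assignment

offers = [
    Offer("gps", "position", accuracy=0.95, cost=5.0),
    Offer("wifi", "position", accuracy=0.70, cost=1.0),
    Offer("net-time", "time", accuracy=0.99, cost=0.1),
]
queries = [
    Query("position", min_accuracy=0.9, max_cost=10.0),
    Query("position", min_accuracy=0.6, max_cost=10.0),
]
result = select_providers(offers, queries)
print({q.min_accuracy: o.name for q, o in result.items()})
```

Note how the second, less demanding query is served by the already-activated GPS provider rather than waking the cheaper Wi-Fi source: one provider satisfies both requests, which is the service-sharing behaviour the abstract highlights.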
Abstract:
In a household's or a nation's production system, social capital has been recognized as an input with major implications for project design as well as policy development. Using a structured questionnaire, household-level data were obtained from a representative sample of 300 rural households in Msinga, KwaZulu-Natal. This study employed the conventional household economic behaviour model under constrained utility maximisation to examine the effect of social capital on household welfare, testing the hypothesis that the possession of social capital improves household welfare. The results show that social capital endowments have a statistically significant positive effect on household welfare, in addition to some household demographic and socio-economic characteristics. The study concluded that access to social capital, among other factors, is crucial for improved rural household welfare and poverty reduction. It is therefore important for government to have knowledge of existing social groups and networks, as this will improve the effectiveness of present strategies aimed at reducing poverty.
Abstract:
We are currently at the cusp of a revolution in quantum technology that relies not just on the passive use of quantum effects, but on their active control. At the forefront of this revolution is the implementation of a quantum computer. Encoding information in quantum states as "qubits" makes it possible to use entanglement and quantum superposition to perform calculations that are infeasible on classical computers. The fundamental challenge in the realization of quantum computers is to avoid decoherence – the loss of quantum properties – due to unwanted interaction with the environment. This thesis addresses the problem of implementing entangling two-qubit quantum gates that are robust with respect to both decoherence and classical noise. It covers three aspects: the use of efficient numerical tools for the simulation and optimal control of open and closed quantum systems, the role of advanced optimization functionals in facilitating robustness, and the application of these techniques to two of the leading implementations of quantum computation, trapped atoms and superconducting circuits. After a review of the theoretical and numerical foundations, the central part of the thesis starts with the idea of using ensemble optimization to achieve robustness with respect to both classical fluctuations in the system parameters and decoherence. For the example of a controlled phase gate implemented with trapped Rydberg atoms, this approach is demonstrated to yield a gate that is at least one order of magnitude more robust than the best known analytic scheme. Moreover, this robustness is maintained even for gate durations significantly shorter than those obtained in the analytic scheme. Superconducting circuits are a particularly promising architecture for the implementation of a quantum computer. Their flexibility is demonstrated by performing optimizations for both diagonal and non-diagonal quantum gates.
In order to achieve robustness with respect to decoherence, it is essential to implement quantum gates in the shortest possible amount of time. This may be facilitated by using an optimization functional that targets an arbitrary perfect entangler, based on a geometric theory of two-qubit gates. For the example of superconducting qubits, it is shown that this approach leads to significantly shorter gate durations, higher fidelities, and faster convergence than optimization towards specific two-qubit gates. Performing the optimization in Liouville space in order to properly take decoherence into account poses significant numerical challenges, since the dimension of Liouville space is the square of that of Hilbert space. However, it can be shown that for a unitary target, the optimization requires propagation of at most three states, instead of a full basis of Liouville space. Both for the example of trapped Rydberg atoms and for superconducting qubits, the successful optimization of quantum gates is demonstrated, at a significantly lower numerical cost than was previously thought possible. Together, the results of this thesis point towards a comprehensive framework for the optimization of robust quantum gates, paving the way for the future realization of quantum computers.
Abstract:
Artisanal columbite-tantalite (coltan) mining has had negative effects on the rural economy in the Great Lakes region of Africa through labor deficits, degradation and loss of farmland, food insecurity, a high cost of living, and reduced traditional export crop production, alongside secondary impacts that remotely affect the quality of air, water, soil, plants, animals, and human wellbeing. The situation is multifaceted and calls for a holistic approach for short- and long-term mitigation of such negative effects. This study focuses on the effects of mine land restoration on soil microbiological quality in the Gatumba Mining District of western Rwanda. Some coltan mine wastelands were afforested with pine and eucalyptus trees, while farmers directly cultivated others due to land scarcity. Farmyard manure (FYM) is the sole fertilizer applied on the wastelands, although it is insufficient to achieve the desired crop yields. Despite this, several multi-purpose plants such as Tithonia diversifolia, Markhamia lutea, and Canavalia brasiliensis thrive in the area and could supplement FYM. The potential for these "new" amendments to improve soil microbial properties, particularly in the tantalite mine soils, was investigated. The specific objectives of the study were to: (a) evaluate the effects of land use on soil microbial indices of the tantalite mine soils; (b) investigate the restorative effects of organic amendments on a Technosol; and (c) estimate the short-term N and P supply potential of the soil amendments in the soils. Fresh soils (0-20 cm) from an unmined native forest, two mine sites afforested with pine and eucalyptus forests (pine and eucalyptus Technosols), an arable land, and two cultivated Technosols (Kavumu and Kirengo Technosols) were analyzed for their physicochemical properties.
Afterwards, a 28-day incubation experiment (22 °C) was conducted, followed by measurements of mineral N, soil microbial biomass C, N, and P, and fungal ergosterol contents using standard methods. This was followed by a 12-week incubation study of the arable soil and the Kavumu Technosol amended with FYM, Canavalia and Tithonia biomass, and Markhamia leaf litter, after which soil microbial properties were measured at 2, 8, and 12 weeks of incubation. Finally, two 4-week incubation experiments each were conducted in soils of the six sites to estimate (i) potential mineralizable N, using a soil-sand mixture (1:1) amended with Canavalia and goat manure, and (ii) P mineralization, using mixtures (1:1) of soil and anion exchange resins in bicarbonate form amended with Tithonia biomass and goat manure. In study one, afforestation increased soil organic carbon and total N contents in the pine and eucalyptus Technosols to 34-40% and 28-30%, respectively, of the levels in the native forest soil. Consequently, the microbial biomass and activity followed a similar trend, with the cultivated Technosols inferior to the afforested ones. The microbial indices of the mine soils were constrained by soil acidity, dithionite-extractable Al, and low P availability. In study two, the amendments substantially increased C and N mineralization and microbial properties compared with non-amended soils. Canavalia biomass increased CO2 efflux by 340%, net N mineralization by 30-140%, and microbial biomass C and N by 240-600% and 240-380% (P < 0.01), respectively, after four weeks of incubation compared with the non-amended soils. Tithonia biomass increased ergosterol content by roughly 240%. The Kavumu Technosol showed a high potential for quick restoration of its soil quality due to its major responses in the measured biological parameters.
In study three, Canavalia biomass gave the highest mineralizable N (130 µg g⁻¹ soil, P < 0.01) in the Kavumu Technosol and the lowest in the native forest soil (-20 µg g⁻¹ soil). Conversely, the mineralizable N of goat manure was negative in all soils, ranging from -2.5 to -7.7 µg N g⁻¹ soil, except in the native forest soil. However, the immobilization of goat manure N in the "cultivated soils" was 30-70% lower than in the "forest soils", signifying an imminent recovery of the amended soils from N immobilization. The mineralization of goat manure P was three-fold that of Tithonia, constituting 61-71% of the total P applied. Phosphorus mineralization slightly decreased after four weeks of incubation due to sulfate competition, as reflected in a negative correlation, which was steeper in the Tithonia treatment. In conclusion, each amendment used in this research played a unique role in C, N, and P mineralization and contributed substantially to microbial properties in the tantalite mine soils. Interestingly, the "N immobilizers" exhibited potential for P release and soil organic carbon storage. Consequently, the combined use of the amendments in specific ratios, or co-composting prior to application, is recommended to optimize nutrient release, microbial biomass dynamics and soil organic matter accrual. Transport of organic inputs seems more feasible for smallholder farmers, who typically manage small field sizes. To reduce acidity in the soils, liming with wood ash was recommended, to also improve P availability and enhance soil biological quality, even if it may only be possible on small areas. Further, afforestation with mixed species of fast-growing eucalyptus and legume or indigenous tree species is suggested to restore tantalite mine wastelands. It is emphasized that most of this research was conducted under controlled laboratory conditions, which exclude interaction with environmental variables.
Also, fine fractions of the amendments were used, in contrast to the usual practice of applying a mixture of predominantly coarser fractions. Therefore, the biological dynamics reported in these studies may not entirely reflect those under farmers' field conditions.
Abstract:
A foundational model of concurrency is developed in this thesis. We examine issues in the design of parallel systems and show why the actor model is suitable for exploiting large-scale parallelism. Concurrency in actors is constrained only by the availability of hardware resources and by the logical dependence inherent in the computation. Unlike dataflow and functional programming, however, actors are dynamically reconfigurable and can model shared resources with changing local state. Concurrency is spawned in actors using asynchronous message-passing, pipelining, and the dynamic creation of actors. This thesis deals with some central issues in distributed computing. Specifically, problems of divergence and deadlock are addressed. For example, actors permit dynamic deadlock detection and removal. The problem of divergence is contained because independent transactions can execute concurrently and potentially infinite processes are nevertheless available for interaction.
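The actor primitives described above (asynchronous message passing, encapsulated local state, dynamic interaction) can be sketched minimally as follows; this is an illustrative toy, not the thesis's formal model.

```python
import threading
import queue

class Actor:
    """A minimal actor: private state, a mailbox, and asynchronous
    message passing. Each actor processes its mailbox sequentially,
    so its local state needs no locks."""
    def __init__(self, behavior):
        self._mailbox = queue.Queue()
        self._behavior = behavior
        threading.Thread(target=self._run, daemon=True).start()

    def send(self, message):
        """Asynchronous send: never blocks on the receiver's processing."""
        self._mailbox.put(message)

    def _run(self):
        while True:
            msg = self._mailbox.get()
            if msg is None:              # shutdown sentinel
                break
            self._behavior(self, msg)

# A counter actor: mutable local state shared safely via message passing.
def counter_behavior(actor, msg):
    kind, payload = msg
    if kind == "inc":
        actor.count = getattr(actor, "count", 0) + payload
    elif kind == "get":
        payload.put(getattr(actor, "count", 0))   # reply channel

counter = Actor(counter_behavior)
for _ in range(100):
    counter.send(("inc", 1))
reply = queue.Queue()
counter.send(("get", reply))
result = reply.get()        # mailbox is FIFO, so all increments land first
print(result)
counter.send(None)          # stop the actor
```

Because the mailbox serializes message handling, the counter behaves like a shared resource with changing local state, which is exactly the capability the abstract contrasts with dataflow and functional models.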
Abstract:
This thesis addresses the problem of categorizing natural objects. To provide a criterion for categorization, we propose that the purpose of a categorization is to support the inference of unobserved properties of objects from the observed properties. Because no such set of categories can be constructed in an arbitrary world, we present the Principle of Natural Modes as a claim about the structure of the world. We first define an evaluation function that measures how well a set of categories supports the inference goals of the observer. Entropy measures for property uncertainty and category uncertainty are combined through a free parameter that reflects the goals of the observer. Natural categorizations are shown to be those that are stable with respect to this free parameter. The evaluation function is tested in the domain of leaves and is found to be sensitive to the structure of the natural categories corresponding to the different species. We next develop a categorization paradigm that utilizes the categorization evaluation function in recovering natural categories. A statistical hypothesis generation algorithm is presented that is shown to be an effective categorization procedure. Examples drawn from several natural domains are presented, including data known to be a difficult test case for numerical categorization techniques. We next extend the categorization paradigm such that multiple levels of natural categories are recovered; by means of recursively invoking the categorization procedure, both the genera and species are recovered in a population of anaerobic bacteria. Finally, a method is presented for evaluating the utility of features in recovering natural categories. This method also provides a mechanism for determining which features are constrained by the different processes present in a multiple modal world.
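One plausible reading of the evaluation function described above is an entropy trade-off: expected property uncertainty within categories plus, weighted by the free parameter, the uncertainty of the category labels themselves. The sketch below is an assumption-laden illustration of that reading, not the thesis's exact functional form.

```python
import math
from collections import Counter

def entropy(labels):
    """Shannon entropy (bits) of the empirical distribution of labels."""
    n = len(labels)
    return -sum((c / n) * math.log2(c / n) for c in Counter(labels).values())

def categorization_cost(objects, categories, beta):
    """Hedged sketch of the entropy trade-off: within-category property
    uncertainty plus beta times category uncertainty. Lower is better;
    beta reflects the observer's goals."""
    n = len(objects)
    members = {}
    for props, cat in zip(objects, categories):
        members.setdefault(cat, []).append(props)
    # Property uncertainty: entropy of each feature within each category,
    # weighted by category size.
    h_props = 0.0
    for group in members.values():
        weight = len(group) / n
        h_props += weight * sum(entropy([obj[f] for obj in group])
                                for f in range(len(group[0])))
    h_cat = entropy(categories)          # category uncertainty
    return h_props + beta * h_cat

# Toy objects with two correlated discrete features; the "natural" split
# groups objects whose properties co-vary, supporting property inference.
objects = [(0, 0), (0, 0), (1, 1), (1, 1)]
good = ["A", "A", "B", "B"]              # aligned with the feature structure
bad = ["A", "B", "A", "B"]               # cuts across the feature structure
g = categorization_cost(objects, good, beta=1.0)
b = categorization_cost(objects, bad, beta=1.0)
print(g, b)
```

The aligned categorization leaves no residual property uncertainty (cost 1.0, all of it category entropy), while the misaligned one keeps both features maximally uncertain within each category (cost 3.0), matching the intuition that good categories support inference of unobserved properties.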
Abstract:
Since robots are typically designed with an individual actuator at each joint, the control of these systems is often difficult and non-intuitive. This thesis explains a more intuitive control scheme called Virtual Model Control. This thesis also demonstrates the simplicity and ease of this control method by using it to control a simulated walking hexapod. Virtual Model Control uses imagined mechanical components to create virtual forces, which are applied through the joint torques of real actuators. This method produces a straightforward means of controlling joint torques to produce a desired robot behavior. Due to the intuitive nature of this control scheme, the design of a virtual model controller is similar to the design of a controller with basic mechanical components. The ease of this control scheme facilitates the use of a high-level control system, which can be used above the low-level virtual model controllers to modulate the parameters of the imaginary mechanical components. In order to apply Virtual Model Control to parallel mechanisms, a solution to the force distribution problem is required. This thesis uses an extension of Gardner's Partitioned Force Control method which allows for the specification of constrained degrees of freedom. This virtual model control technique was applied to a simulated hexapod robot. Although the hexapod is a highly non-linear, parallel mechanism, the virtual models allowed textbook control solutions to be used while the robot was walking. Using a simple linear control law, the robot walked while simultaneously balancing a pendulum and tracking an object.
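The core of Virtual Model Control, mapping an imagined component's Cartesian force into real joint torques, can be sketched with the standard Jacobian-transpose relation tau = J^T F. The 2-link leg geometry, unit link lengths, and gains below are hypothetical, chosen only to make the mapping concrete.

```python
import numpy as np

def virtual_spring_damper_force(x, x_des, v, k, b):
    """Force of an imagined spring-damper pulling a point (e.g. the body)
    toward a desired position: F = k (x_des - x) - b v."""
    return k * (np.asarray(x_des) - np.asarray(x)) - b * np.asarray(v)

def joint_torques(jacobian, force):
    """Map a virtual Cartesian force to real joint torques: tau = J^T F."""
    return np.asarray(jacobian).T @ np.asarray(force)

def planar_2link_jacobian(q1, q2):
    """Jacobian of the foot position of a planar 2-link leg with unit
    link lengths, with respect to the two joint angles."""
    return np.array([
        [-np.sin(q1) - np.sin(q1 + q2), -np.sin(q1 + q2)],
        [ np.cos(q1) + np.cos(q1 + q2),  np.cos(q1 + q2)],
    ])

# Virtual spring-damper attached between the foot and a setpoint.
q = (0.3, 0.8)
J = planar_2link_jacobian(*q)
F = virtual_spring_damper_force(x=[0.9, 1.1], x_des=[1.0, 1.0],
                                v=[0.0, 0.0], k=100.0, b=10.0)
tau = joint_torques(J, F)
print(tau)
```

The controller designer only reasons about where to attach springs and dampers and how stiff to make them; the Jacobian transpose turns that mechanical intuition into actuator commands, which is what makes the scheme feel like designing with basic mechanical components.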
Abstract:
One objective of artificial intelligence is to model the behavior of an intelligent agent interacting with its environment. The environment's transformations can be modeled as a Markov chain, whose state is partially observable to the agent and affected by its actions; such processes are known as partially observable Markov decision processes (POMDPs). While the environment's dynamics are assumed to obey certain rules, the agent does not know them and must learn. In this dissertation we focus on the agent's adaptation as captured by the reinforcement learning framework. This means learning a policy---a mapping of observations into actions---based on feedback from the environment. The learning can be viewed as browsing a set of policies while evaluating them by trial through interaction with the environment. The set of policies is constrained by the architecture of the agent's controller. POMDPs require a controller to have a memory. We investigate controllers with memory, including controllers with external memory, finite state controllers and distributed controllers for multi-agent systems. For these various controllers we work out the details of the algorithms which learn by ascending the gradient of expected cumulative reinforcement. Building on statistical learning theory and experiment design theory, a policy evaluation algorithm is developed for the case of experience re-use. We address the question of sufficient experience for uniform convergence of policy evaluation and obtain sample complexity bounds for various estimators. Finally, we demonstrate the performance of the proposed algorithms on several domains, the most complex of which is simulated adaptive packet routing in a telecommunication network.
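The policy-gradient recipe described above can be illustrated in its simplest memoryless form (the dissertation's controllers have memory; this sketch does not): REINFORCE on an invented two-observation problem, ascending the gradient of expected reward.

```python
import math
import random

random.seed(1)

# Tiny two-observation, two-action problem (invented for illustration):
# in observation 0 action 1 pays off, in observation 1 action 0 does.
def reward(obs, action):
    return 1.0 if action != obs else 0.0

def policy_prob(theta, obs, action):
    """Sigmoid policy: probability of taking action 1 given the observation."""
    p1 = 1.0 / (1.0 + math.exp(-theta[obs]))
    return p1 if action == 1 else 1.0 - p1

def reinforce(episodes=4000, lr=0.1):
    theta = [0.0, 0.0]                   # one parameter per observation
    for _ in range(episodes):
        obs = random.randrange(2)
        action = 1 if random.random() < policy_prob(theta, obs, 1) else 0
        r = reward(obs, action)
        # For the sigmoid policy, d/dtheta log pi(a|o) = a - p1 (a in {0,1}),
        # so the REINFORCE update is lr * r * (a - p1).
        p1 = policy_prob(theta, obs, 1)
        theta[obs] += lr * r * (action - p1)
        # (A baseline would reduce variance; omitted for brevity.)
    return theta

theta = reinforce()
print(theta)
```

After training, theta[0] is pushed positive (favoring action 1 for observation 0) and theta[1] negative, so the learned mapping of observations into actions maximizes expected reinforcement, which is the one-step essence of the gradient-ascent algorithms the dissertation develops for memoryful controllers.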