955 results for Specification Animation


Relevance: 10.00%

Abstract:

ALICE (A Large Ion Collider Experiment) is an experiment at CERN (European Organization for Nuclear Research) whose heavy-ion detector is dedicated to exploiting the unique physics potential of nucleus-nucleus interactions at LHC (Large Hadron Collider) energies. As part of that project, 716 so-called type V4 modules were assembled in the Detector Laboratory of the Helsinki Institute of Physics during the years 2004-2006. With over a million detector strips in total, this has been the most massive particle detector project in the history of Finnish science. One ALICE SSD module consists of a double-sided silicon sensor, two hybrids containing 12 HAL25 front-end readout chips, and passive components such as resistors and capacitors. The components are connected together by TAB (Tape Automated Bonding) microcables. The components of the modules were tested at every assembly phase with comparable electrical tests to ensure the reliable functioning of the detectors and to pinpoint possible problems. Components were accepted or rejected according to limits confirmed by the ALICE collaboration. This study concentrates on the test results of framed chips, hybrids and modules. The total yield of the framed chips is 90.8%, of hybrids 96.1% and of modules 86.2%. The individual test results have been investigated in the light of the known error sources that appeared during the project. Once the problems arising during the learning curve of the project had been solved, material problems, such as defective chip cables and sensors, seemed to cause most of the assembly rejections. These problems typically showed up in tests as too many individual channel failures. Bonding failures, by contrast, rarely caused the rejection of any component. One sensor type among the three sensor manufacturers proved to be of lower quality than the others: its sensors are very noisy, and their depletion voltages usually fall outside the specification given to the manufacturers. Reaching a 95% assembly yield during module production demonstrates that the assembly process has been highly successful.

Relevance: 10.00%

Abstract:

An ongoing challenge for Learning Analytics research has been the scalable derivation of user interaction data from multiple technologies. The complexities associated with this challenge are increasing as educators embrace an ever-growing number of social and content-related technologies. The Experience API (xAPI), alongside the development of user-specific record stores, has been touted as a means to address this challenge, but a number of subtle considerations must be made when using xAPI in Learning Analytics. This paper provides a general overview of the complexities and challenges of using xAPI in a general systemic analytics solution, the Connected Learning Analytics (CLA) toolkit. The importance of design is emphasised, as is the notion of common vocabularies and xAPI Recipes. Early decisions about vocabularies and structural relationships between statements can serve to either facilitate or handicap later analytics solutions. The CLA toolkit case study provides us with a way of examining both the strengths and the weaknesses of the current xAPI specification, and we conclude with a proposal for how xAPI might be improved by using JSON-LD to formalise Recipes in a machine-readable form.
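
For readers unfamiliar with the specification, an xAPI statement is a JSON document built around an actor, a verb and an object, each identified by IRIs; the vocabulary and Recipe decisions discussed above amount to fixing which IRIs and statement structures conforming tools agree to emit. A minimal sketch (the IRIs and names below are illustrative, not the CLA toolkit's actual vocabulary):

```python
import json

# A minimal xAPI statement: actor-verb-object, with IRIs identifying
# the verb and the activity type. A Recipe fixes which IRIs and
# structures a community of tools agrees to use; the ones below are
# illustrative examples only.
statement = {
    "actor": {
        "objectType": "Agent",
        "name": "Example Learner",
        "mbox": "mailto:learner@example.edu",
    },
    "verb": {
        "id": "http://adlnet.gov/expapi/verbs/commented",
        "display": {"en-US": "commented"},
    },
    "object": {
        "objectType": "Activity",
        "id": "http://example.edu/forum/thread/42",
        "definition": {
            "type": "http://id.tincanapi.com/activitytype/discussion",
            "name": {"en-US": "Week 3 discussion thread"},
        },
    },
}

print(json.dumps(statement, indent=2))
```

The JSON-LD proposal in the paper would let a Recipe declare such constraints in a machine-readable form, rather than leaving them to prose documentation.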

Relevance: 10.00%

Abstract:

This talk gives an overview of the project "Uncanny Nature", which incorporates a style of animation called Hybrid Stop Motion that combines physical object armatures with virtual copies. The development of the production pipeline (using a mix of Blender, Dragonframe, Photoscan and Arduino) is discussed, as well as the way Blender was used throughout the production to visualise, model, animate and composite the elements together.

Relevance: 10.00%

Abstract:

The session examines the role of the metaphysical and physical in art and animation and how this relates to natural spaces. The Soviet Russian film director and theorist Sergei Eisenstein saw animation as possessing an ability he called "plasmaticity", the capacity for a being to assume any conceivable form dynamically. He saw each being as "primordial protoplasm, not yet possessing a 'stable' form, but capable of assuming any form" (Eisenstein 1989, 21). He was enamoured of the capacity of animation to transform and be liberated, of being able to escape from a fixed and static identity, to embody a "rejection of the once-and-forever allotted form" in which we are held (Eisenstein 1989, 21). The Czech Surrealist animator Jan Švankmajer brings a metaphysical approach, grounded in a belief in animism, to art and animation. He believes that objects possess a conscious life or spirit: 'Objects conceal within themselves the events they've witnessed. I don't actually animate objects. I coerce their inner life out of them.' (Švankmajer in Imre 2009, 214) In this animistic world there are no boundaries or rules, no physical or conceptual restrictions; anything is possible, with inanimate objects and places able to become animate and transact in a conscious relationship with humans and each other. This session invites artists, animators and theorists to discuss their conceptions of, and approaches to, using visuals to promote and provoke transformation.

Relevance: 10.00%

Abstract:

This paper discusses my current research, which aims to re-member the site of the Peel Island Lazaret through re-imagining the Teerk Roo Ra forest as a series of animated artworks. Teerk Roo Ra National Park (formerly known as Peel Island) is a small island in Moreton Bay, Queensland, visible on the ferry journey from Cleveland to Stradbroke Island. The island has an intriguing history and is the site of a former lazaret and quarantine station. The lazaret treated patients diagnosed with Hansen's disease (leprosy) and operated between 1907 and 1959. In this paper I discuss conceptions of the non-indigenous historical context of the Peel Island Lazaret and the notion of the liminal state (Turner, 1967). Through this discussion, conceptions of place from the Australian cultural theorist Ross Gibson are also examined. The concept of two overlapping realms is then explored through the clues and shared stories about the people who inhabited the site. I then explain my own approach to re-membering this place through re-imagining the forest that witnessed the events of the lazaret. Drawing on theories of the uncanny from the German psychiatrist Ernst Jentsch, the Austrian neurologist Sigmund Freud and the South African animation theorist Meg Rickards, I argue that my experience of the forest of Teerk Roo Ra was an uncanny one in which two worlds, or states of mind, existed simultaneously and overlapped, causing a viscerally unsettling experience. Through an analysis of the Czech Surrealist animator Jan Švankmajer's cinematic narrative Down to the Cellar (1982), my creative work Structure #24 (2011), and the Australian artist Patricia Piccinini's cinematic artwork The Gathering (2007), I discuss the situation of the inanimate and the animate co-existing simultaneously. Using this approach I propose an understanding of the uncanny as an intellectual uncertainty, as outlined by Jentsch (1906), and develop the notion of the familiar being concealed and becoming unfamiliar through mimicry (Freud, 1919). These discussions form an introduction to my creative work Nocturne #5 (2014), which re-members the forests of Teerk Roo Ra as an uncanny place expressed primarily through animation.

Relevance: 10.00%

Abstract:

Chris Denaro is a Brisbane-based animator whose work incorporates a blend of physical stop motion and digital motion graphics. This exhibition, Nocturne, uses animation to embody the genius loci of the former Peel Island Lazaret on the island of Teerk Roo Ra in Moreton Bay, Queensland. The project developed a form of animation that harnesses the medium's plasmatic quality to express an in-between state of being, and examines the capacity of animation to push and pull at the boundary lines between what can be apprehended as the 'real' and the 'imaginary'. The Nocturne constructions cycle forever, with no beginning and no end, only a slightly familiar hypnotic rhythm describing a continual process of adaptation and renewal. These artworks treat the animation loop as a mental state rather than a sequence of events illustrating a narrative. The loop can also be an anxious, compulsive place, divorced from the linear nature of reality, hypnotised in trance-like repetition. Nocturne investigates how conceptions of place are overlaid by aspects of history, memory and the imagination.

Relevance: 10.00%

Abstract:

In Uncanny Nature, Merri Randell and Chris Denaro use animation and surreal photography to make us rethink what we understand about nature.

Relevance: 10.00%

Abstract:

The current state of the practice in Blackspot Identification (BSI) utilizes safety performance functions based on total crash counts to identify transport system sites with potentially high crash risk. This paper postulates that total crash count variation over a transport network is the result of multiple distinct crash-generating processes, including geometric characteristics of the road, spatial features of the surrounding environment, and driver behaviour factors. These multiple sources are, however, ignored by current modelling methodologies when trying either to explain or to predict crash frequencies across sites. Instead, current practice employs models that imply a single underlying crash-generating process. This model mis-specification may lead to correlating crashes with the incorrect sources of contributing factors (e.g. concluding that a crash is predominantly caused by a geometric feature when it is a behavioural issue), which may ultimately lead to inefficient use of public funds and misidentification of true blackspots. This study aims to propose a latent class model consistent with a multiple crash process theory, and to investigate the influence this model has on correctly identifying crash blackspots. We first present the theoretical and corresponding methodological approach, in which a Bayesian Latent Class (BLC) model is estimated under the assumption that crashes arise from two distinct risk-generating processes: engineering factors and unobserved spatial factors. The Bayesian model is used to incorporate prior information about the contribution of each underlying process to the total crash count. The methodology is applied to the state-controlled roads in Queensland, Australia, and the results are compared to an Empirical Bayesian Negative Binomial (EB-NB) model. A comparison of goodness-of-fit measures shows significantly improved performance of the proposed model compared to the NB model. The detection of blackspots was also improved relative to the EB-NB model. In addition, modelling crashes as the result of two fundamentally separate underlying processes reveals more detailed information about unobserved crash causes.
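
The paper's model is Bayesian with negative binomial components; as a simplified, hypothetical illustration of the underlying latent-class idea, the following sketch fits a two-component Poisson mixture to site-level crash counts with EM (all data and parameter values are made up):

```python
import numpy as np
from scipy.stats import poisson

rng = np.random.default_rng(0)

# Hypothetical site-level crash counts drawn from two latent processes
# (e.g. a low-risk engineering-driven process and a high-risk one),
# unknown to the analyst, who observes only total counts.
counts = np.concatenate([rng.poisson(1.5, 300), rng.poisson(6.0, 100)])

# EM algorithm for a two-component Poisson mixture.
pi, lam = 0.5, np.array([1.0, 5.0])  # initial mixing weight and means
for _ in range(200):
    # E-step: responsibility of the high-risk component for each site
    p1 = (1.0 - pi) * poisson.pmf(counts, lam[0])
    p2 = pi * poisson.pmf(counts, lam[1])
    r = p2 / (p1 + p2)
    # M-step: update the mixing weight and the component means
    pi = r.mean()
    lam = np.array([np.sum((1 - r) * counts) / np.sum(1 - r),
                    np.sum(r * counts) / np.sum(r)])

print(f"mixing weight: {pi:.2f}, component means: {lam.round(2)}")
# Sites with a high responsibility r are candidate blackspots whose
# excess risk a single-process count model would smooth away.
```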

Relevance: 10.00%

Abstract:

Acetaminophen (paracetamol) is available in a wide range of oral formulations designed to meet the needs of the population across the age spectrum, but for people with impaired swallowing, i.e. dysphagia, both solid and liquid medications can be difficult to swallow without modification. The effect of a commercial polysaccharide thickener, designed to be added to fluids to promote safe swallowing by dysphagic patients, on rheology and acetaminophen dissolution was tested using crushed immediate-release tablets in water, effervescent tablets in water, elixir, and suspension. The inclusion of the thickener, comprised of xanthan gum and maltodextrin, had a considerable impact on dissolution; acetaminophen release from modified medications reached only 12-50% in 30 minutes, which does not meet the pharmacopeia specification for immediate-release preparations. Flow curves reflect the high zero-shear viscosity and the apparent yield stress of the thickened products. The weak-gel nature, in combination with high G' values compared to G" (viscoelasticity) and a high apparent yield stress, impacts drug release. The restriction on drug release from these formulations is not influenced by the theoretical state of the drug (dissolved or dispersed), and the approach typically used in clinical practice (mixing crushed tablets into pre-prepared thickened fluid) cannot be improved by altering the order of incorporation or the mixing method.
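
As background to the flow-curve discussion, fluids that combine an apparent yield stress with shear thinning are commonly described by the Herschel-Bulkley model; this is offered as a general constitutive sketch, not necessarily the fit used in the paper:

\[
\tau = \tau_0 + K\,\dot{\gamma}^{\,n},
\]

where \(\tau\) is the shear stress, \(\tau_0\) the apparent yield stress, \(K\) the consistency index, \(\dot{\gamma}\) the shear rate, and \(n < 1\) for shear-thinning fluids such as xanthan gum solutions. Below \(\tau_0\) the material does not flow, which is consistent with the restricted drug release reported above.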

Relevance: 10.00%

Abstract:

This thesis addresses the modeling of financial time series, especially stock market returns and daily price ranges. Modeling data of this kind can be approached with so-called multiplicative error models (MEM). These models nest several well-known time series models, such as GARCH, ACD and CARR models, and are able to capture many well-established features of financial time series, including volatility clustering and leptokurtosis. In contrast to these phenomena, different kinds of asymmetries have received relatively little attention in the existing literature. In this thesis asymmetries arise from various sources. They are observed in both conditional and unconditional distributions, for variables with non-negative values and for variables with values on the real line. In the multivariate context, asymmetries can be observed in the marginal distributions as well as in the relationships between the variables modeled. New methods for all these cases are proposed. Chapter 2 considers GARCH models and the modeling of returns of two stock market indices. The chapter introduces the so-called generalized hyperbolic (GH) GARCH model to account for asymmetries in both the conditional and the unconditional distribution. In particular, two special cases of the GARCH-GH model which describe the data most accurately are proposed. They are found to improve the fit of the model when compared to symmetric GARCH models. The advantages of accounting for asymmetries are also observed through Value-at-Risk applications. Both theoretical and empirical contributions are provided in Chapter 3 of the thesis. In this chapter the so-called mixture conditional autoregressive range (MCARR) model is introduced, examined and applied to daily price ranges of the Hang Seng Index. The conditions for the strict and weak stationarity of the model, as well as an expression for the autocorrelation function, are obtained by writing the MCARR model as a first-order autoregressive process with random coefficients. The chapter also introduces the inverse gamma (IG) distribution to CARR models. The advantages of the CARR-IG and MCARR-IG specifications over conventional CARR models are demonstrated in the empirical application both in- and out-of-sample. Chapter 4 discusses the simultaneous modeling of absolute returns and daily price ranges. In this part of the thesis a vector multiplicative error model (VMEM) with an asymmetric Gumbel copula is found to provide substantial benefits over the existing VMEM models based on elliptical copulas. The proposed specification is able to capture the highly asymmetric dependence of the modeled variables, thereby improving the performance of the model considerably. The economic significance of the results is established by examining the information content of the derived volatility forecasts.
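
For reference, the baseline multiplicative error model for a non-negative variable \(x_t\) (an absolute return, a duration or a daily price range) is usually written as follows; this is the standard textbook form, not necessarily the exact specification used in the thesis:

\[
x_t = \mu_t\,\varepsilon_t, \qquad \varepsilon_t \overset{\text{iid}}{\sim} \mathcal{D}^{+}(1,\sigma^2), \qquad \mu_t = \omega + \alpha\,x_{t-1} + \beta\,\mu_{t-1},
\]

where \(\varepsilon_t\) is a positive innovation with unit mean and \(\mu_t\) is the conditional mean of \(x_t\). Taking \(x_t\) to be a squared return recovers a GARCH model, a trade duration gives ACD, and a daily price range gives CARR, which is the sense in which MEM nests these models.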

Relevance: 10.00%

Abstract:

Reuse of existing, carefully designed and tested software improves the quality of new software systems and reduces their development costs. Object-oriented frameworks provide an established means for software reuse on the levels of both architectural design and concrete implementation. Unfortunately, the complexity of frameworks, which typically results from their flexibility and overall abstract nature, causes severe problems in using them. Patterns are generally accepted as a convenient way of documenting frameworks and their reuse interfaces. In this thesis it is argued, however, that mere static documentation is not enough to solve the problems related to framework usage. Instead, proper interactive assistance tools are needed in order to enable systematic framework-based software production. This thesis shows how patterns that document a framework's reuse interface can be represented as dependency graphs, and how dynamic lists of programming tasks can be generated from those graphs to assist the process of using a framework to build an application. This approach to framework specialization combines the ideas of framework cookbooks and task-oriented user interfaces. Tasks provide assistance in (1) creating new code that complies with the framework reuse interface specification, (2) assuring the consistency between existing code and the specification, and (3) adjusting existing code to meet the terms of the specification. Besides illustrating how task-orientation can be applied in the context of using frameworks, this thesis describes a systematic methodology for modeling any framework reuse interface in terms of software patterns based on dependency graphs. The methodology shows how framework-specific reuse interface specifications can be derived from a library of existing reusable pattern hierarchies. Since the methodology focuses on reusing patterns, it also alleviates the recognized problem of framework reuse interface specifications becoming complicated and unmanageable for frameworks of realistic size. The ideas and methods proposed in this thesis have been tested by implementing a framework specialization tool called JavaFrames. JavaFrames uses role-based patterns that specify the reuse interface of a framework to guide framework specialization in a task-oriented manner. This thesis reports the results of case studies in which JavaFrames and the hierarchical framework reuse interface modeling methodology were applied to the Struts web application framework and the JHotDraw drawing editor framework.
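
The core mechanism, generating a dynamic task list from a dependency graph of pattern roles, can be sketched as follows. This is a hypothetical simplification; JavaFrames's actual role and task model is considerably richer:

```python
from graphlib import TopologicalSorter

# Pattern roles of a hypothetical framework reuse interface, each
# mapped to the roles it depends on.
roles = {
    "MyApp extends Application":    [],
    "MyView extends AbstractView":  ["MyApp extends Application"],
    "MyModel implements Model":     ["MyApp extends Application"],
    "bind MyView to MyModel":       ["MyView extends AbstractView",
                                     "MyModel implements Model"],
}

done = {"MyApp extends Application"}  # roles already bound by user code

def pending_tasks(roles, done):
    """Yield currently actionable programming tasks in dependency order."""
    for role in TopologicalSorter(roles).static_order():
        if role not in done and all(dep in done for dep in roles[role]):
            yield f"TODO: provide code for role '{role}'"

for task in pending_tasks(roles, done):
    print(task)
```

As the developer completes tasks, roles move into `done` and the list is regenerated, which is the "dynamic" aspect described above.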

Relevance: 10.00%

Abstract:

Creative Work as part of the Nocturne series

Relevance: 10.00%

Abstract:

Requirements engineering is an important phase in software development, where the customer's needs and expectations are transformed into a software requirements specification. The requirements specification can be considered an agreement between the customer and the developer in which both parties agree on the expected system features and behaviour. However, requirements engineers must deal with a variety of issues that complicate the requirements process. The communication gap between the customer and the developers is among the typical reasons for unsatisfactory requirements. In this thesis we study how the use case technique could be used in requirements engineering to bridge the communication gap between the customer and the development team. We also discuss how use case descriptions can be used as a basis for acceptance test cases.
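
The correspondence between a use case's main success scenario and an acceptance test can be made concrete by turning the scenario steps into test steps. A minimal hypothetical sketch (the use case, the FakeAtm stand-in and all names are invented for illustration):

```python
# Use case "Withdraw cash": the main success scenario rewritten,
# step by step, as an acceptance test. FakeAtm is a stand-in for
# the real system under test.
class FakeAtm:
    def __init__(self):
        self.dispensed, self.card, self.pin_ok = 0, None, False

    def insert_card(self, number): self.card = number
    def enter_pin(self, pin): self.pin_ok = (pin == "0000")
    def request_withdrawal(self, amount):
        if self.pin_ok:
            self.dispensed = amount
            self.card = None  # card returned after dispensing
    def cash_dispensed(self): return self.dispensed
    def card_returned(self): return self.card is None

def test_withdraw_cash_main_scenario():
    atm = FakeAtm()
    atm.insert_card("1234 5678 9012 3456")  # 1. Customer inserts card
    atm.enter_pin("0000")                   # 2. Customer enters PIN
    atm.request_withdrawal(50)              # 3. Customer requests amount
    assert atm.cash_dispensed() == 50       # 4. System dispenses cash
    assert atm.card_returned()              # 5. System returns the card

test_withdraw_cash_main_scenario()
```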

Relevance: 10.00%

Abstract:

This study focuses on the theory of individual rights that the German theologian Conrad Summenhart (1455-1502) explicated in his massive work Opus septipartitum de contractibus pro foro conscientiae et theologico. The central question to be studied is: how does Summenhart understand the concept of an individual right and its immediate implications? The basic premise of this study is that in Opus septipartitum Summenhart composed a comprehensive theory of individual rights as a contribution to the ongoing medieval discourse on rights. With this rationale, the first part of the study concentrates on earlier discussions of rights as the background to Summenhart's theory. Special attention is paid to the language in which right was defined in terms of 'power'. In the fourteenth century, writers like Hervaeus Natalis and William Ockham maintained that right signifies a power by which the right-holder can use material things licitly. It will also be shown how attempts to describe what is meant by the term 'right' became more specific and cultivated. Gerson followed the implications that the term 'power' had in natural philosophy and attributed rights to animals and other creatures. To secure right as a normative concept, Gerson utilized the ancient ius suum cuique principle of justice and introduced a definition in which right was seen as derived from justice. The latter part of this study endeavours to reconstruct Summenhart's theory of individual rights in three sections. The first section clarifies Summenhart's discussion of the right of the individual, or the concept of an individual right. Summenhart specified Gerson's description of right as power, making further use of the language of natural philosophy. In this respect, Summenhart's theory brought to an end a particular continuity of thought centred on the view that right signifies a power to licit action. Perhaps the most significant feature of Summenhart's discussion was the way he explicated the implication of liberty present in Gerson's language of rights. Summenhart assimilated libertas to the self-mastery or dominion that, in the economic context of the discussion, took the form of (a moderate) self-ownership. Summenhart's discussion also introduced two apparent extensions to Gerson's terminology. First, Summenhart classified right as a relation, and second, he equated right with dominion. It is distinctive of Summenhart's view that he took action as the primary determinant of right: everyone has as much right or dominion in regard to a thing as there are actions it is licit for him to exercise in regard to it. The second section elaborates Summenhart's discussion of the species of dominion, which answered the question of what kinds of rights exist and thereby clarified the implications of the concept of an individual right. The central feature of Summenhart's discussion was his conscious effort to systematize Gerson's language by combining classifications of dominion into a coherent whole. In this respect, his treatment of natural dominion is emblematic. Summenhart constructed the concept of natural dominion using the concepts of foundation (founded on a natural gift) and law (according to the natural law). In defining natural dominion as dominion founded on a natural gift, Summenhart attributed natural dominion to animals and even to heavenly bodies.
In discussing man's natural dominion, Summenhart pointed out that natural dominion is not sufficiently identified by its foundation but requires further specification, which he finds in the idea that natural dominion is appropriate to its subject according to the natural law. This characterization led him to treat God's dominion as natural dominion. In part, this was due to Summenhart's specific understanding of the natural law, which made reasonableness the primary criterion of natural dominion at the expense of any metaphysical considerations. The third section clarifies Summenhart's discussion of the property rights defined by positive human law. By delivering an account of juridical property rights, Summenhart connected his philosophical and theological theory of rights to the juridical language of his times and demonstrated that his own language of rights was compatible with current juridical terminology. Summenhart prepared his discussion of property rights with an account of the justification for private property, which gave private property a direct and strong natural-law-based justification. Summenhart's discussion of the four property rights usus, usufructus, proprietas, and possession aimed at delivering a detailed report of the usage of these concepts in juridical discourse. His discussion was characterized by extensive use of the juridical source texts, becoming the more direct and verbal the more it became entangled with the details of juridical doctrine. At the same time he promoted his own language of rights, especially by applying the idea of right as a relation. He also made a recognizable effort towards systematizing the juridical language related to property rights.

Relevance: 10.00%

Abstract:

This doctoral thesis addresses the macroeconomic effects of real shocks in open economies under flexible exchange rate regimes. The first study of this thesis analyses the welfare effects of fiscal policy in a small open economy where private and government consumption are substitutes in terms of private utility. The main findings are as follows: fiscal policy raises output, bringing it closer to its efficient level, but is not welfare-improving even though government spending directly affects private utility. The main reason for this is that the introduction of useful government spending implies a larger crowding-out effect on private consumption when compared with the 'pure waste' case. Utility decreases because one unit of government consumption yields less utility than one unit of private consumption. The second study analyses how the macroeconomic effects of fiscal policy in a small open economy depend on optimal intertemporal behaviour. The key result is that the effects of fiscal policy depend on the size of the elasticity of substitution between traded and nontraded goods. In particular, the sign of the current account response to fiscal policy depends on the interplay between the intertemporal elasticity of aggregate consumption and the elasticity of substitution between traded and nontraded goods. The third study analyses the consequences of productive government spending for the international transmission of fiscal policy. A standard result in the New Open Economy Macroeconomics literature is that a fiscal shock depreciates the exchange rate. I demonstrate that the response of the exchange rate depends on the productivity of government spending: if productivity is sufficiently high, a fiscal shock appreciates the exchange rate. It is also shown that the introduction of productive government spending increases both domestic and foreign welfare, when compared with the case where government spending is wasted. The fourth study analyses how the international transmission of technology shocks depends on the specification of nominal rigidities. A growing body of empirical evidence suggests that a positive technology shock leads to a temporary decline in employment. In this study, I demonstrate that the open economy dimension can enhance the ability of sticky-price models to account for this evidence. The reasoning is as follows: an improvement in technology appreciates the nominal exchange rate, and under producer-currency pricing, the exchange rate appreciation shifts global demand away from domestic goods toward foreign goods. This causes a temporary decline in domestic employment.
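
One common way to formalise the substitutability between private and government consumption examined in the first study is to let period utility depend on "effective consumption"; this is a standard specification in the literature, and the thesis's exact functional form may differ:

\[
U_0 = \mathbb{E}_0 \sum_{t=0}^{\infty} \beta^t\, u\!\left(C_t + \theta G_t\right), \qquad 0 \le \theta \le 1,
\]

where \(C_t\) is private and \(G_t\) government consumption. With \(\theta < 1\), a unit of government consumption yields less utility than a unit of private consumption, which drives the welfare result above; \(\theta = 0\) recovers the 'pure waste' benchmark.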