661 results for transforms
Abstract:
Duality can be viewed as the soul of every von Neumann growth model. This is not at all surprising: von Neumann (1955), a mathematical genius, studied quantum mechanics extensively, a field built around a "dual nature" (electromagnetic waves and discrete corpuscles, or light quanta), and this may have influenced the development of his own concept of economic duality. The main object of this paper is to restore the spirit of economic duality in the investigation of the multiple von Neumann equilibria. By means of the (ir)reducibility taxonomy in Móczár (1995), the author transforms the primal canonical decomposition given by Bromek (1974) for the von Neumann growth model into a synergistic primal and dual canonical decomposition. This enables us to obtain all the information about the steadily maintainable states of growth sustained by the compatible price constellations at each distinct expansion factor.
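For orientation, the primal-dual structure referred to here can be summarized by the classical von Neumann equilibrium conditions. The notation below is assumed for this sketch (A the input matrix, B the output matrix, x the row vector of activity intensities, p the column vector of prices, α the expansion factor, β the interest factor) and is not taken from the paper itself:

```latex
% Classical von Neumann equilibrium conditions (Kemeny–Morgenstern–Thompson form); assumed notation.
\begin{aligned}
  xB &\ge \alpha\, xA      &&\text{(quantity side: outputs cover the expanded inputs)}\\
  Bp &\le \beta\, Ap       &&\text{(price side: no process is more than normally profitable)}\\
  x\,(B-\alpha A)\,p &= 0  &&\text{(overproduced goods receive zero price)}\\
  x\,(B-\beta A)\,p &= 0   &&\text{(unprofitable processes are operated at zero intensity)}\\
  xBp &> 0, \qquad x \ge 0,\; p \ge 0, \qquad \alpha = \beta \text{ at equilibrium.}
\end{aligned}
```

The decomposition discussed in the abstract concerns how these conditions can hold at several distinct expansion factors when the technology (A, B) is reducible.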
Abstract:
In this paper we examine some questions of the intensity-based modeling of credit derivatives. We show that, with a suitable change of measure, the Laplace transform of the distribution of the compound loss and default process can be computed not only for doubly stochastic processes but also for point processes with an arbitrary intensity. _____ The paper addresses questions concerning the use of intensity-based modeling in the pricing of credit derivatives. As the specification of the distribution of the loss process is a non-trivial exercise, the well-known technique for this task utilizes the inversion of the Laplace transform. A popular choice for the model is the class of doubly stochastic processes, given that their Laplace transforms can be determined easily. Unfortunately these processes lack several key features supported by the empirical observations, e.g. they cannot replicate the self-exciting nature of defaults. The aim of the paper is to show that by using an appropriate change of measure the Laplace transform can be calculated not only for a doubly stochastic process, but for an arbitrary point process with intensity as well. To support the application of the technique, we investigate the effect of the change of measure on the stochastic nature of the underlying process.
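For context, the tractability of the doubly stochastic (Cox) case rests on a standard conditioning identity; the notation here is assumed for illustration (N_T the number of defaults up to time T, λ the stochastic intensity) and is not taken from the paper:

```latex
% Laplace transform of a Cox process: condition on the intensity path, then apply the Poisson transform.
\mathbb{E}\!\left[e^{-u N_T}\right]
  \;=\; \mathbb{E}\!\left[\exp\!\Big(-\big(1-e^{-u}\big)\,\Lambda_T\Big)\right],
  \qquad \Lambda_T \;=\; \int_0^T \lambda_s\,\mathrm{d}s .
```

The computation thus reduces to the (often explicitly known) transform of the integrated intensity; the paper's point is that a suitable change of measure recovers comparable tractability for general point processes with intensity.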
Abstract:
A methodology for formally modeling and analyzing the software architecture of mobile agent systems provides a solid basis for developing high-quality mobile agent systems, and the methodology is helpful for studying other distributed and concurrent systems as well. However, providing such a methodology is a challenge because of agent mobility in mobile agent systems. The methodology was defined from two essential parts of software architecture: a formalism to define the architectural models and an analysis method to formally verify system properties. The formalism is two-layer Predicate/Transition (PrT) nets extended with dynamic channels, and the analysis method is a hierarchical approach to verify models on different levels. The two-layer modeling formalism smoothly transforms physical models of mobile agent systems into their architectural models. Dynamic channels facilitate the synchronous communication between nets, and they naturally capture the dynamic architecture configuration and agent mobility of mobile agent systems. Component properties are verified based on transformed individual components, system properties are checked in a simplified system model, and interaction properties are analyzed on models composed from the involved nets. Based on the formalism and the analysis method, this researcher formally modeled and analyzed a software architecture of mobile agent systems, and designed an architectural model of a medical information processing system based on mobile agents. The model checking tool SPIN was used to verify system properties such as reachability, concurrency, and safety of the medical information processing system. From successfully modeling and analyzing the software architecture of mobile agent systems, the conclusion is that PrT nets extended with channels are a powerful tool to model mobile agent systems, and the hierarchical analysis method provides a rigorous foundation for the modeling tool. The hierarchical analysis method not only reduces the complexity of the analysis, but also expands the application scope of model checking techniques. The results of formally modeling and analyzing the software architecture of the medical information processing system show that model checking is an effective and efficient way to verify software architecture. Moreover, this system demonstrates the high flexibility, efficiency, and low cost of mobile agent technologies.
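As a rough, generic illustration of the Petri-net firing semantics underlying this kind of formalism (a plain place/transition net; the dissertation's two-layer PrT nets with dynamic channels are far richer, and the names below are invented for the sketch):

```python
# Minimal place/transition net firing sketch (illustrative only; not the PrT-net formalism itself).
from dataclasses import dataclass, field

@dataclass
class Net:
    marking: dict                                     # place -> token count
    transitions: dict = field(default_factory=dict)   # name -> (input places, output places)

    def enabled(self, t):
        inputs, _ = self.transitions[t]
        return all(self.marking.get(p, 0) >= n for p, n in inputs.items())

    def fire(self, t):
        if not self.enabled(t):
            raise ValueError(f"transition {t} is not enabled")
        inputs, outputs = self.transitions[t]
        for p, n in inputs.items():
            self.marking[p] -= n
        for p, n in outputs.items():
            self.marking[p] = self.marking.get(p, 0) + n

# Hypothetical toy net: an agent migrates from host A to host B over an open channel.
net = Net(marking={"agent_at_A": 1, "channel_open": 1})
net.transitions["migrate"] = ({"agent_at_A": 1, "channel_open": 1},
                              {"agent_at_B": 1, "channel_open": 1})
net.fire("migrate")
print(net.marking)   # {'agent_at_A': 0, 'channel_open': 1, 'agent_at_B': 1}
```

Each transition consumes tokens from its input places and produces tokens in its output places; agent migration can be pictured as such a transition moving an "agent token" between location places.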
Abstract:
The microarray technology provides a high-throughput technique to study gene expression. Microarrays can help us diagnose different types of cancers, understand biological processes, assess host responses to drugs and pathogens, find markers for specific diseases, and much more. Microarray experiments generate large amounts of data. Thus, effective data processing and analysis are critical for making reliable inferences from the data. The first part of the dissertation addresses the problem of finding an optimal set of genes (biomarkers) to classify a set of samples as diseased or normal. Three statistical gene selection methods (GS, GS-NR, and GS-PCA) were developed to identify a set of genes that best differentiate between samples. A comparative study on different classification tools was performed and the best combinations of gene selection and classifiers for multi-class cancer classification were identified. For most of the benchmark cancer data sets, the gene selection method proposed in this dissertation, GS, outperformed the other gene selection methods. The classifiers based on Random Forests, neural network ensembles, and K-nearest neighbors (KNN) showed consistently good performance. A striking commonality among these classifiers is that they all use a committee-based approach, suggesting that ensemble classification methods are superior. The same biological problem may be studied at different research labs and/or performed using different lab protocols or samples. In such situations, it is important to combine results from these efforts. The second part of the dissertation addresses the problem of pooling the results from different independent experiments to obtain improved results. Four statistical pooling techniques (Fisher's inverse chi-square method, the Logit method, Stouffer's Z-transform method, and the Liptak-Stouffer weighted Z-method) were investigated in this dissertation. These pooling techniques were applied to the problem of identifying cell cycle-regulated genes in two different yeast species. As a result, improved sets of cell cycle-regulated genes were identified. The last part of the dissertation explores the effectiveness of wavelet data transforms for the task of clustering. Discrete wavelet transforms, with an appropriate choice of wavelet bases, were shown to be effective in producing clusters that were biologically more meaningful.
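As one concrete example of the pooling techniques mentioned, here is a minimal sketch of the (Liptak-)Stouffer weighted Z-method, assuming independent one-sided p-values. The weights and p-values are made up for illustration, and this is the textbook formula rather than the dissertation's own code:

```python
# Stouffer / Liptak weighted Z-method for pooling independent one-sided p-values.
import numpy as np
from scipy.stats import norm

def weighted_stouffer(p_values, weights=None):
    p = np.asarray(p_values, dtype=float)
    w = np.ones_like(p) if weights is None else np.asarray(weights, dtype=float)
    z = norm.isf(p)                        # per-study z-scores, z_i = Phi^{-1}(1 - p_i)
    z_combined = np.sum(w * z) / np.sqrt(np.sum(w ** 2))
    return norm.sf(z_combined)             # combined one-sided p-value

# Hypothetical p-values for one gene from two independent yeast experiments, the second downweighted.
print(weighted_stouffer([0.03, 0.08], weights=[1.0, 0.5]))
```

With equal weights this reduces to Stouffer's plain Z-transform method; Fisher's method would instead combine the p-values through minus twice the sum of their logarithms.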
Abstract:
Software engineering researchers are challenged to provide increasingly more powerful levels of abstraction to address the rising complexity inherent in software solutions. One new development paradigm that places models as abstractions at the forefront of the development process is Model-Driven Software Development (MDSD). MDSD considers models as first-class artifacts, extending the capability for engineers to use concepts from the problem domain of discourse to specify apposite solutions. A key component in MDSD is domain-specific modeling languages (DSMLs), which are languages with focused expressiveness targeting a specific taxonomy of problems. The de facto approach is to first transform DSML models to an intermediate artifact in a high-level language (HLL), e.g., Java or C++, and then execute the resulting code. Our research group has developed a class of DSMLs, referred to as interpreted DSMLs (i-DSMLs), where models are directly interpreted by a specialized execution engine with semantics based on model changes at runtime. This execution engine uses a layered architecture and is referred to as a domain-specific virtual machine (DSVM). As the domain-specific model being executed descends the layers of the DSVM, the semantic gap between the user-defined model and the services provided by the underlying infrastructure is closed. The focus of this research is the synthesis engine, the layer in the DSVM which transforms i-DSML models into executable scripts for the next lower layer to process. The appeal of an i-DSML is constrained because it possesses unique semantics contained within the DSVM. Existing DSVMs for i-DSMLs exhibit tight coupling between the implicit model of execution and the semantics of the domain, making it difficult to develop DSVMs for new i-DSMLs without a significant investment in resources. At the onset of this research only one i-DSML had been created, for the user-centric communication domain, using the aforementioned approach. This i-DSML is the Communication Modeling Language (CML) and its DSVM is the Communication Virtual Machine (CVM). A major problem with the CVM's synthesis engine is that the domain-specific knowledge (DSK) and the model of execution (MoE) are tightly interwoven; consequently, subsequent DSVMs would need to be developed from inception with no reuse of expertise. This dissertation investigates how to decouple the DSK from the MoE and subsequently produce a generic model of execution (GMoE) from the remaining application logic. This GMoE can be reused to instantiate synthesis engines for DSVMs in other domains. The generalized approach to developing the model synthesis component of i-DSML interpreters utilizes a reusable framework loosely coupled to the DSK as swappable framework extensions. This approach involves first creating an i-DSML and its DSVM for a second domain, demand-side smart grid (microgrid energy management), and designing the synthesis engine so that the DSK and MoE are easily decoupled. To validate the utility of the approach, the synthesis engines are instantiated using the GMoE and the DSKs of the two aforementioned domains, and an empirical study is performed to support our claim of reduced developmental effort.
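One way to picture the decoupling described above is a generic execution engine parameterized by a swappable domain-knowledge interface. Everything in this sketch (class names, methods, the two toy domains) is invented for illustration and is not the dissertation's CVM/DSVM code:

```python
# Hypothetical illustration of decoupling domain-specific knowledge (DSK) from a
# generic model of execution (GMoE); all names here are invented for this sketch.
from abc import ABC, abstractmethod

class DomainKnowledge(ABC):
    """Swappable DSK extension: all domain semantics live behind this interface."""
    @abstractmethod
    def interpret(self, model_change: dict) -> list:   # returns executable steps
        ...

class CommunicationDSK(DomainKnowledge):
    def interpret(self, model_change):
        return [f"setup_call({model_change['participants']})"]

class MicrogridDSK(DomainKnowledge):
    def interpret(self, model_change):
        return [f"dispatch_load({model_change['demand_kw']} kW)"]

class SynthesisEngine:
    """Generic model of execution: walks model changes and delegates semantics to the DSK."""
    def __init__(self, dsk: DomainKnowledge):
        self.dsk = dsk

    def synthesize(self, model_changes):
        script = []
        for change in model_changes:
            script.extend(self.dsk.interpret(change))
        return script

# The same generic engine instantiated for two domains:
print(SynthesisEngine(CommunicationDSK()).synthesize([{"participants": ["alice", "bob"]}]))
print(SynthesisEngine(MicrogridDSK()).synthesize([{"demand_kw": 42}]))
```

The SynthesisEngine (standing in for the GMoE) is reused unchanged across domains; only the DomainKnowledge extension is swapped.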
Abstract:
What is the architecture of transience? What role does architecture play in the impermanent context of the nomad? What form does architecture take when our perception of shelter transforms from fixed and static to flexible and transportable? How does architecture react to the challenges of mobility and change? Traditional building forms speak of stability as an important aspect of architecture. Does portability imply a different building form? During the 1950s Buckminster Fuller introduced the idea of mobile, portable structures. In the 1960s Archigram's examples of architectural nomadism made the mobile home an accepted feature of our contemporary landscape. Currently, new materials and new methods of assembly and transportation open opportunities for rethinking portable architecture. For this thesis, a shelter was developed which provides inhabitable space and portability. The shelter was designed to be easily carried as a backpack. With minimal human effort, the structure is assembled and erected in a few minutes. Although this portable shelter needs to be maneuvered, folded and tucked away for transportation, it does meet the demands of nomadic behavior, which emphasizes comfort and portability.
Abstract:
Postgraduate studies in Psychology have passed through an intense process of growth and consolidation, attested to by current high levels of scientific production. The return that psychological science has given to a society that has made large investments in it is, however, questionable. Considering the increasing integration of Psychology in the social welfare area, one possible and necessary form of contribution is the expansion of the social policy debate. This work aimed to discuss how Psychology postgraduate studies can contribute to understanding the issue of social policy. The objects of analysis were academic theses defended in the 2007/2009 triennium that matched one of five thematic criteria, which yielded 105 of the 824 theses defended in the period. The main results point to a predominantly scattered presence of the issue across Psychology programs; only for a limited set of researchers and programs does "social policy" appear as a priority object of research, indicating an incipient systematization of these studies. Moreover, it was found that while the majority of theses are characterized by fragile theoretical frameworks in relation to the subject, with most research taking a strictly technical perspective, a proportion of the studies shows concern with placing the social policy debate in a broader social context, which is the essential condition for constructing a reasoned and robust theoretical critique. In conclusion, this thesis argues that psychological science can only contribute effectively to social development if the academic community promotes a structured articulation around the theme, deepens the theoretical debate, and transforms the knowledge produced into organized political practice.
Abstract:
The thematization of public space in the "Maior São João do Mundo" in Campina Grande, PB, stimulates the economy and local tourism by transforming an ordinary public space into a setting based on the traditional June festivals. In doing so, it contributes to the promotion of the creative sectors existing in the city and to the design of a new image of the city projected from the São João festivities. In this research we propose to determine the influence of the thematization of public space on the local economy, particularly on the creative sectors present in the "Maior São João do Mundo", and to assess their importance for the development of the local creative economy. We chose a case study with an ethnographic approach, using different research techniques such as participant observation, semi-structured interviews with open questions, and the analysis of respondents' social representations. The methodology is mixed because it involves both qualitative and quantitative data. At the end of this research we could see that the thematization of public space in the "Maior São João do Mundo" is the main reference factor for the event, stimulating the local economy and changing the city's image on three levels: political, economic and social. We also found that the thematization of public space is the key binding factor among the creative sectors, as well as between them and related activities. All these sectors serve as a link between products and services, creating a harmonic whole that transforms the city's image, stimulates the economy, promotes social inclusion and cultural integration, and keeps the "Maior São João do Mundo" a traditional event in the regional and national tourist calendar.
Abstract:
One of several techniques applied to oil production processes is artificial lift, which uses equipment to reduce the bottom-hole pressure, providing a pressure differential that results in increased flow. The choice of artificial lift method depends on a detailed analysis of several factors, such as the initial costs of installation and maintenance and the conditions existing in the producing field. The Electrical Submersible Pumping (ESP) method is quite efficient when the objective is to produce high liquid flow rates in both onshore and offshore environments, under adverse temperature conditions and in the presence of viscous fluids. By definition, ESP is an artificial lift method in which a subsurface electric motor transforms electrical energy into mechanical energy to drive a multistage centrifugal pump, each stage composed of a rotating impeller (rotor) and a stationary diffuser (stator). The pump converts the mechanical energy of the motor into kinetic energy in the form of velocity, which pushes the fluid to the surface. The objective of this work is to implement the flexible-polyhedron optimization method, known as the Modified Simplex Method (MSM), applied to the study of the influence of modifying the inlet and outlet parameters of the centrifugal pump impeller channel in an ESP system. Using the optimization method to change the angular parameters of the pump, the data applied to the simulations yielded optimized values of head (lift height), lossless efficiency, and power, with distinct results.
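As a rough illustration (not the authors' implementation), the flexible-polyhedron / Nelder-Mead search can be driven from SciPy. The objective below, a pump head surrogate as a function of hypothetical inlet and outlet blade angles, is invented purely for the sketch:

```python
# Flexible polyhedron (Nelder-Mead) search over impeller blade angles; the objective
# function is a made-up placeholder, not the pump model used in the actual study.
from scipy.optimize import minimize

def negative_head(angles):
    beta_in, beta_out = angles            # hypothetical inlet/outlet blade angles (degrees)
    # Placeholder smooth surrogate with a maximum near (25, 35) degrees.
    return -(100.0 - 0.05 * (beta_in - 25.0) ** 2 - 0.08 * (beta_out - 35.0) ** 2)

result = minimize(negative_head, x0=[20.0, 30.0], method="Nelder-Mead",
                  options={"xatol": 1e-3, "fatol": 1e-3})
print(result.x, -result.fun)              # optimized angles and the corresponding head
```

In practice the surrogate would be replaced by the simulated head or efficiency evaluated at each candidate set of angular parameters, which is exactly the kind of derivative-free evaluation the simplex method is suited to.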
Abstract:
The large amount of data generated as a result of automation and process supervision in industry leads to two problems: a large demand for disk storage and the difficulty of streaming this data over a telecommunications link. Lossy data compression algorithms emerged in the 1990s to solve these problems and, as a consequence, industries started to use them in supervision systems to compress data in real time. These algorithms were designed to eliminate redundant and undesired information in an efficient and simple way. However, their parameters must be set for each process variable, which becomes impracticable in systems that monitor thousands of variables. In this context, this work proposes the Adaptive Swinging Door Trending algorithm, an adaptation of Swinging Door Trending in which the main parameters are adjusted dynamically by analyzing signal trends in real time. A comparative performance analysis of lossy compression algorithms applied to process-variable time series and dynamometer cards is also presented. The algorithms used for comparison were piecewise linear methods and transform-based methods.
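For reference, a minimal sketch of the classical (non-adaptive) Swinging Door Trending corridor test follows; the deviation threshold comp_dev and the sample data are illustrative, and the adaptive on-line tuning of the parameters that the paper proposes is not shown:

```python
# Simplified classical Swinging Door Trending (SDT); comp_dev is the compression deviation.
def swinging_door(points, comp_dev):
    """points: list of (t, v) pairs with strictly increasing t; returns the archived subset."""
    archived = [points[0]]
    anchor_t, anchor_v = points[0]
    slope_lo, slope_hi = float("-inf"), float("inf")
    prev = points[0]
    for t, v in points[1:]:
        dt = t - anchor_t
        # The "doors": admissible slopes of a line from the anchor passing within +/- comp_dev of (t, v).
        slope_lo = max(slope_lo, (v - comp_dev - anchor_v) / dt)
        slope_hi = min(slope_hi, (v + comp_dev - anchor_v) / dt)
        if slope_lo > slope_hi:
            # The doors have closed: archive the previous sample and restart the corridor from it.
            archived.append(prev)
            anchor_t, anchor_v = prev
            dt = t - anchor_t
            slope_lo = (v - comp_dev - anchor_v) / dt
            slope_hi = (v + comp_dev - anchor_v) / dt
        prev = (t, v)
    if archived[-1] != prev:
        archived.append(prev)          # always keep the most recent sample
    return archived

samples = list(enumerate([0.0, 0.1, 0.2, 1.5, 1.6, 1.4, 3.0]))
print(swinging_door(samples, comp_dev=0.3))
```

Each archived point anchors a corridor of admissible slopes that narrows as new samples arrive; when the corridor closes, the previous sample is archived and the corridor restarts from it, so only the retained points need to be stored or transmitted.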
Abstract:
This dissertation consists of three distinct components: (1) "Double Rainbow," a notated composition for an acoustic ensemble of 10 instruments, ca. 36 minutes; (2) "Appalachiana," a fixed-media composition for electro-acoustic music and video, ca. 30 minutes; and (3) "'The Invisible Mass': Exploring Compositional Technique in Alfred Schnittke's Second Symphony," an analytical article.
(1) Double Rainbow is a ca. 36-minute composition in four movements scored for 10 instruments: flute, Bb clarinet (doubling on bass clarinet), tenor saxophone (doubling on alto saxophone), French horn, percussion (glockenspiel, vibraphone, wood block, 3 toms, snare drum, bass drum, suspended cymbal), piano, violin, viola, cello, and double bass. Each of the four movements of the piece explores its own distinct character and set of compositional goals. The piece is presented as a musical score and as a recording, which was extensively treated in post-production.
(2) Appalachiana is a ca. 30-minute fixed-media composition for music and video. The musical component was created as a vehicle to showcase several approaches to electro-acoustic music composition: FFT re-synthesis for time-manipulation effects, the use of a custom-built software instrument which implements generative approaches to creating rhythm and pitch patterns, using a recording of rain to create rhythmic triggers for software instruments, and recording additional components with acoustic instruments. The video component transforms footage of natural landscapes filmed at several locations in North Carolina, Virginia, and West Virginia into a surreal narrative using a variety of color, lighting, distortion, and time-manipulation video effects.
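As a generic illustration of the kind of FFT-based time manipulation mentioned above (a stock phase-vocoder time stretch via librosa, not the custom software instrument built for the piece; the file name is hypothetical):

```python
# Generic phase-vocoder time stretch: slows audio to half speed while preserving pitch.
import librosa
import soundfile as sf

y, sr = librosa.load("field_recording.wav", sr=None)    # hypothetical input file
stretched = librosa.effects.time_stretch(y, rate=0.5)   # rate < 1 stretches the material in time
sf.write("field_recording_halfspeed.wav", stretched, sr)
```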
(3) "'The Invisible Mass': Exploring Compositional Technique in Alfred Schnittke's Second Symphony" is an analytical article that focuses on Alfred Schnittke's compositional technique as evidenced in the construction of his Second Symphony and discussed by the composer in a number of previously untranslated articles and interviews. Though this symphony is pivotal in the composer's oeuvre, there are currently no scholarly articles that offer in-depth analyses of the piece. The article combines analyses of the harmony, form, and orchestration in the Second Symphony with relevant quotations from the composer, some from published and translated sources and others newly translated by the author from research at the Russian State Library in St. Petersburg. These offer a perspective on how Schnittke's compositional technique combines systematic geometric design with keen musical intuition.
Abstract:
Highlights of Data Expedition:
• Students explored daily observations of local climate data spanning the past 35 years.
• Topological Data Analysis, or TDA for short, provides cutting-edge tools for studying the geometry of data in arbitrarily high dimensions.
• Using TDA tools, students discovered intrinsic dynamical features of the data and learned how to quantify periodic phenomena in a time series.
• Since nature invariably produces noisy data which rarely has exact periodicity, students also considered the theoretical basis of almost-periodicity and even invented and tested new mathematical definitions of almost-periodic functions.

Summary: The dataset we used for this data expedition comes from the Global Historical Climatology Network. "GHCN (Global Historical Climatology Network)-Daily is an integrated database of daily climate summaries from land surface stations across the globe." Source: https://www.ncdc.noaa.gov/oa/climate/ghcn-daily/ We focused on the daily maximum and minimum temperatures from January 1, 1980 to April 1, 2015 collected from RDU International Airport. Through a guided series of exercises designed to be performed in Matlab, students explore these time series, initially by direct visualization and basic statistical techniques. Then students are guided through a special sliding-window construction which transforms a time series into a high-dimensional geometric curve. These high-dimensional curves can be visualized by projecting down to lower dimensions as in the figure below (Figure 1); however, our focus here was to use persistent homology to directly study the high-dimensional embedding. The shape of these curves carries meaningful information, but how one describes the "shape" of data depends on the scale at which the data is considered, and choosing the appropriate scale is rarely obvious. Persistent homology overcomes this obstacle by allowing us to quantitatively study geometric features of the data across multiple scales. Through this data expedition, students are introduced to numerically computing persistent homology using the Rips collapse algorithm and interpreting the results. In the specific context of sliding-window constructions, 1-dimensional persistent homology can reveal the nature of periodic structure in the original data. I created a special technique to study how these high-dimensional sliding-window curves form loops in order to quantify the periodicity. Students are guided through this construction and learn how to visualize and interpret this information. Climate data is extremely complex (as anyone who has suffered from a bad weather prediction can attest) and numerous variables play a role in determining our daily weather and temperatures. This complexity, coupled with imperfections of measuring devices, results in very noisy data, which causes the annual seasonal periodicity to be far from exact. To this end, I have students explore existing theoretical notions of almost-periodicity and test them on the data. They find that some existing definitions are inadequate in this context. Hence I challenged them to invent new mathematics by proposing and testing their own definitions. These students rose to the challenge and suggested a number of creative definitions. While autocorrelation and spectral methods based on Fourier analysis are often used to explore periodicity, the construction here provides an alternative paradigm for quantifying periodic structure in almost-periodic signals using tools from topological data analysis.
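For concreteness, a minimal version of the sliding-window (delay) embedding described above is sketched below; the persistent homology step itself (e.g., a Rips filtration on the resulting point cloud) is omitted, and the dimension, delay, and synthetic signal are illustrative choices rather than the values used in the expedition:

```python
# Sliding-window (delay) embedding of a 1-D time series into R^dim; illustrative parameters.
import numpy as np

def sliding_window(series, dim, tau):
    """Map series f to points [f(t), f(t+tau), ..., f(t+(dim-1)*tau)] for each valid start t."""
    series = np.asarray(series, dtype=float)
    n = len(series) - (dim - 1) * tau
    return np.stack([series[i : i + (dim - 1) * tau + 1 : tau] for i in range(n)])

t = np.arange(0, 40, 0.25)
signal = np.sin(2 * np.pi * t / 10.0)          # synthetic periodic stand-in for daily temperatures
cloud = sliding_window(signal, dim=20, tau=2)  # high-dimensional curve; its loops reveal periodicity
print(cloud.shape)                             # (number of windows, 20)
```

A truly periodic signal traces a closed loop in the embedding space, which 1-dimensional persistent homology detects as a prominent class; noisy, almost-periodic data produces loops whose prominence quantifies how close to periodic the signal is.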
Abstract:
This article examines the structure and health implications of two industries, chicken and tomatoes, that play prominent roles in US food and agricultural competitiveness. Both industries have become more concentrated over time, with powerful "lead firms" driving geographical, technological, and marketing changes. Overall, a processed food revolution has taken place in agricultural products that transforms the types of food and dietary options available to consumers. The nature of contemporary food and agricultural value chains affects the strategies and policies that can be effectively employed to address major health goals such as improved nutrition, food safety, and food security.
Abstract:
Monitoring and enforcement are perhaps the biggest challenges in the design and implementation of environmental policies in developing countries, where the actions of many small informal actors cause significant impacts on ecosystem services and where the transaction costs for the state to regulate them can be enormous. This dissertation studies the potential of innovative institutions based on decentralized coordination and enforcement to induce better environmental outcomes. Such policies have in common that the state plays the role of providing the incentives for organization, but the process of compliance happens through decentralized agreements, trust building, signaling and monitoring. I draw from the literatures on collective action, common-pool resources, game theory and non-point source pollution to develop the instruments proposed here. To test the different conditions in which such policies could be implemented, I designed two field experiments that I conducted with small-scale gold miners in the Colombian Pacific and with users and providers of ecosystem services in the states of Veracruz, Quintana Roo and Yucatan in Mexico. This dissertation is organized in three essays.
The first essay, “Collective Incentives for Cleaner Small-Scale Gold Mining on the Frontier: Experimental Tests of Compliance with Group Incentives given Limited State Monitoring”, examines whether collective incentives, i.e. incentives provided to a group conditional on collective compliance, can “outsource” the required local monitoring, i.e. induce group interactions that extend the reach of a state that can observe only aggregate consequences, in the context of small-scale gold mining. I employed a framed field-lab experiment in which the miners make decisions regarding mining intensity. The state sets a collective target for an environmental outcome, verifies compliance, and provides a group reward for compliance which is split equally among members. Since the target set by the state transforms the situation into a coordination game, outcomes depend on expectations of what others will do. I conducted this experiment with 640 participants in a mining region of the Colombian Pacific, and I examine different levels of policy severity and their ordering. The findings of the experiment suggest that such instruments can induce compliance, but this regulation involves tradeoffs. The most severe targets – with rewards just above costs – raise gains if successful but can collapse rapidly and completely. In terms of group interactions, better outcomes are found when severity is initially lower, suggesting learning.
The second essay, “Collective Compliance can be Efficient and Inequitable: Impacts of Leaders among Small-Scale Gold Miners in Colombia”, explores the channels through which communication helps groups to coordinate in the presence of collective incentives, and whether the solutions reached are equitable. Also in the context of small-scale gold mining in the Colombian Pacific, I test the effect of communication on compliance with a collective environmental target. The results suggest that communication, as expected, helps to solve coordination challenges, but some groups still reach agreements involving unequal outcomes. By examining the agreements that took place in each group, I observe that the main coordination mechanism was the presence of leaders who helped other group members to clarify the situation. Interestingly, leaders not only helped groups to reach efficiency but also played a key role in equity by defining how the costs of compliance would be distributed among group members.
The third essay, “Creating Local PES Institutions and Increasing Impacts of PES in Mexico: A Real-Time Watershed-Level Framed Field Experiment on Coordination and Conditionality”, considers the creation of a local payments for ecosystem services (PES) mechanism as an assurance game that requires coordination between two groups of participants: upstream and downstream. Based on this assurance interaction, I explore the effect that allowing peer sanctions on upstream behavior has on the functioning of the mechanism. This field-lab experiment was implemented in three real cases of the Mexican Fondos Concurrentes (matching funds) program in the states of Veracruz, Quintana Roo and Yucatan, where 240 real users and 240 real providers of hydrological services were recruited and interacted with each other in real time. The experimental results suggest that initial trust-game behaviors align with participants’ perceptions and predict baseline giving in the assurance game. For upstream providers, i.e. those who get sanctioned, the threat and the use of sanctions increase contributions. Downstream users contribute less when offered the option to sanction – as if that option signaled an uncooperative upstream – but then contributions rise in line with the complementarity in payments of the assurance game.
Abstract:
To respect copyright, the electronic version of this thesis has been stripped of certain visual and audio-visual documents. The complete version of the thesis was deposited with the Service de la gestion des documents et des archives de l'Université de Montréal.