956 results for Covering law model


Relevance: 30.00%

Abstract:

Background: The design of newly engineered microbial strains for biotechnological purposes would greatly benefit from the development of realistic mathematical models of the processes to be optimized. Such models can then be analyzed and, with the development and application of appropriate optimization techniques, one can identify the modifications that need to be made to the organism in order to achieve the desired biotechnological goal. As appropriate models for such an analysis are necessarily non-linear and typically non-convex, finding their global optimum is a challenging task. Canonical modeling techniques, such as Generalized Mass Action (GMA) models based on the power-law formalism, offer a possible solution to this problem because they have a mathematical structure that enables the development of specific algorithms for global optimization. Results: Based on the GMA canonical representation, we developed in previous work a highly efficient optimization algorithm and a set of related strategies for understanding the evolution of adaptive responses in cellular metabolism. Here, we explore the possibility of recasting kinetic non-linear models into an equivalent GMA model, so that global optimization can be performed on the recast GMA model. With this technique, optimization is greatly facilitated and the results are transposable to the original non-linear problem. This procedure is straightforward for a particular class of non-linear models known as Saturable and Cooperative (SC) models, which extend the power-law formalism to deal with saturation and cooperativity. Conclusions: Our results show that recasting non-linear kinetic models into GMA models is indeed an appropriate strategy that helps overcome some of the numerical difficulties that arise during the global optimization task.
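The recasting step described above can be illustrated with a minimal sketch (the rate constants `V` and `K` are hypothetical, not taken from the paper): a saturable Michaelis-Menten rate becomes an exact product of power laws once an auxiliary variable z = K + S is introduced, which is the GMA form.

```python
# Minimal sketch of recasting a saturable rate into GMA power-law form.
# V and K are assumed illustrative kinetic parameters.
V, K = 1.0, 0.5

def v_saturable(S):
    """Original non-linear (saturable) Michaelis-Menten rate."""
    return V * S / (K + S)

def v_gma(S):
    """Recast GMA rate: a product of power laws in S and z = K + S."""
    z = K + S
    return V * S**1 * z**-1

# the recast model reproduces the original rate exactly
for S in (0.1, 1.0, 10.0):
    assert abs(v_saturable(S) - v_gma(S)) < 1e-12
```

In a full recast model, z would be carried as an extra dynamic variable with dz/dt = dS/dt, so the whole system stays in power-law form.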

Relevance: 30.00%

Abstract:

This thesis is composed of three main parts. The first is a state of the art of the different notions that are significant for understanding the elements surrounding art authentication in general, and signatures in particular, and that the author deemed necessary to fully grasp the microcosm that makes up this particular market. Readers with a solid knowledge of the art and expertise field who are particularly interested in the present study are advised to proceed directly to Chapter 4. The expertise of the signature, its reliability, and the factors affecting the expert's conclusions are brought forward. The final aim of the state of the art is to offer a general list of recommendations based on an exhaustive review of the current literature, given in light of all of the issues exposed. These guidelines are formulated specifically for the expertise of signatures on paintings, but can also be applied to wider themes in the area of signature examination. The second part of this thesis covers the experimental stages of the research. It consists of the method developed to authenticate painted signatures on works of art. This method is articulated around several main objectives: defining measurable features on painted signatures and establishing their relevance in order to determine the separation capacity between groups of authentic and simulated signatures. For the first time, numerical analyses of painted signatures have been obtained and used to attribute their authorship to given artists. An in-depth discussion of the developed method constitutes the third and final part of this study. It evaluates the opportunities and constraints of its application by signature and handwriting experts in forensic science. The outlines presented below offer a rapid overview of the study and summarize the aims and main themes of each chapter.
Part I - Theory. Chapter 1 presents the legal aspects surrounding the authentication of works of art by art experts. The definition of what is legally authentic, the qualifications and types of experts who can express an opinion concerning the authorship of a specific painting, and standard deontological rules are addressed. The practices applied in Switzerland are dealt with specifically. Chapter 2 presents an overview of the different scientific analyses that can be carried out on paintings (from the canvas to the top coat). Scientific examinations of works of art have become more common as more and more museums equip themselves with laboratories, so an understanding of their role in the art authentication process is vital. The added value that a signature expertise can have in comparison with other scientific techniques is also addressed. Chapter 3 provides a historical overview of the signature on paintings throughout the ages, in order to offer the reader an understanding of the origin of the signature on works of art and its evolution through time. An explanation is given of the transitions that the signature went through from the 15th century onwards and how it progressively took on its widely known modern form. Both this chapter and Chapter 2 are presented to show the reader the rich sources of information that can be drawn upon to describe a painting, and how the signature is one of these sources. Chapter 4 focuses on the different hypotheses the forensic handwriting examiner (FHE) must keep in mind when examining a painted signature, since a number of scenarios can be encountered when dealing with signatures on works of art. The different forms of signatures, as well as the variables that may influence painted signatures, are also presented. Finally, the current state of knowledge of the examination procedure for signatures in forensic science in general, and for painted signatures in particular, is set out.
The state of the art of the assessment of the authorship of signatures on paintings is established and discussed in light of the theoretical facets mentioned previously. Chapter 5 considers key elements that can have an impact on the FHE during his or her examinations. This includes a discussion of elements such as the skill, confidence and competence of an expert, as well as the potential bias effects he or she might encounter. A better understanding of the elements surrounding handwriting examinations, in order to better communicate results and conclusions to an audience, is also pursued. Chapter 6 reviews the judicial acceptance of signature analysis in courts and closes the state-of-the-art section of this thesis. This chapter brings forward the current issues pertaining to the appreciation of this expertise by the non-forensic community, and discusses the increasing number of claims of the unscientific nature of signature authentication. The necessity of aiming for more scientific, comprehensive and transparent authentication methods is discussed. The theoretical part of this thesis concludes with a series of general recommendations for forensic handwriting examiners, specifically for the expertise of signatures on paintings. These recommendations stem from the exhaustive review of the literature and the issues it exposed, and can also be applied to the traditional examination of signatures (on paper). Part II - Experimental part. Chapter 7 describes and defines the sampling, extraction and analysis phases of the research. The sampling of artists' signatures and their respective simulations is presented, followed by the steps undertaken to extract and determine sets of characteristics, specific to each artist, that describe their signatures. The method is based on a study of five artists and a group of individuals acting as forgers for the sake of this study.
Finally, the procedure for analyzing these characteristics to assess the strength of evidence, based on a Bayesian reasoning process, is presented. Chapter 8 outlines the results concerning both the artist and simulation corpuses after their optical observation, followed by the results of the analysis phase of the research. The feature selection process and the likelihood ratio evaluation are the main themes addressed. The discrimination power between the two corpuses is illustrated through multivariate analysis. Part III - Discussion. Chapter 9 discusses the materials, the methods, and the obtained results of the research. The opportunities, but also the constraints and limits, of the developed method are presented. Future work that can be carried out following the results of the study is also outlined. Chapter 10, the last chapter of this thesis, proposes a strategy to incorporate the model developed in the preceding chapters into traditional signature expertise procedures. Thus, the strength of this expertise is discussed in conjunction with the traditional conclusions reached by forensic handwriting examiners. Finally, this chapter summarizes and advocates a list of formal recommendations for good practice for handwriting examiners. In conclusion, the research highlights the interdisciplinary nature of the examination of signatures on paintings. The current state of knowledge of the judicial standing of art experts, along with the scientific and historical analysis of paintings and signatures, is reviewed to give the reader a feel for the different factors that have an impact on this particular subject. The uneven acceptance of forensic signature analysis in court, also presented in the state of the art, explicitly demonstrates the necessity of better recognition of signature expertise by courts of law.
This general acceptance, however, can only be achieved by producing high-quality results through a well-defined examination process. This research offers an original approach to attributing a painted signature to a certain artist: for the first time, a probabilistic model used to measure the discriminative potential between authentic and simulated painted signatures is studied. The opportunities and limits of this method of scientifically establishing the authorship of signatures on works of art are thus presented. In addition, as its second key contribution, this work proposes a procedure to combine the developed method with that traditionally used by signature experts in forensic science. Such an implementation into holistic traditional signature examination casework is a large step towards providing the forensic, judicial and art communities with a soundly based reasoning framework for the examination of signatures on paintings. The framework and preliminary results associated with this research have been published (Montani, 2009a) and presented at international forensic science conferences (Montani, 2009b; Montani, 2012).

Relevance: 30.00%

Abstract:

We propose a simple rheological model to describe the thixotropic behavior of paints, since the classical hysteresis area usually employed is not sufficient to evaluate thixotropy. The model is based on the assumption that viscosity is a direct measure of the structural level of the paint. It rests on two equations: the Cross-Carreau equation, which describes the equilibrium viscosity, and a second-order kinetic equation, which expresses the time dependence of viscosity. Two characteristic thixotropic times are distinguished: one for the net structure breakdown, which is defined as a power-law function of shear rate, and another for the net structure buildup, which does not depend on the shear rate. Knowledge of both kinetic processes can be used to improve the quality and applicability of paints. Five representative commercial protective marine paints are tested, based on chlorinated rubber, acrylic, alkyd, vinyl, and epoxy resins. The temperature dependence of the rheological behavior is also studied, with the temperature ranging from 5 ºC to 35 ºC. It is found that the paints exhibit both shear-thinning and thixotropic behavior. The model satisfactorily fits the thixotropy of the studied paints and is also able to predict its dependence on temperature. Both viscosity and the degree of thixotropy increase as the temperature decreases.
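The two-equation structure described above can be sketched numerically. This is a minimal illustration under assumed parameters (a Cross-type equilibrium viscosity and simple rate constants, none taken from the paper), showing how viscosity relaxes toward its equilibrium value through second-order kinetics, with a shear-rate-dependent breakdown rate and a constant buildup rate.

```python
# Sketch of the two-equation thixotropy model with hypothetical parameters.
eta0, eta_inf, lam, m = 10.0, 0.1, 1.0, 0.8   # assumed Cross parameters

def eta_eq(shear_rate):
    """Cross-type equation for the equilibrium (steady-state) viscosity."""
    return eta_inf + (eta0 - eta_inf) / (1.0 + (lam * shear_rate) ** m)

def step(eta, shear_rate, dt, a=0.5, b=0.6, k_build=0.05):
    """One Euler step of the second-order kinetic equation.
    Breakdown rate is a power law of shear rate; buildup rate is constant."""
    target = eta_eq(shear_rate)
    k = a * shear_rate ** b if eta > target else k_build
    # second-order: rate proportional to the squared distance from equilibrium
    return eta + dt * k * (target - eta) * abs(target - eta)

eta = eta0                      # fully structured paint at rest
for _ in range(2000):
    eta = step(eta, shear_rate=5.0, dt=0.01)
# after sustained shearing, viscosity has relaxed close to eta_eq(5.0)
```

Fitting `a`, `b` and `k_build` to stepwise shear-rate experiments is what distinguishes the breakdown and buildup time scales for each paint.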

Relevance: 30.00%

Abstract:

This research examines the impacts of the Swiss reform of the allocation of tasks, which was accepted in 2004 and implemented in 2008 to "re-assign" responsibilities between the federal government and the cantons. Public tasks were redistributed according to the leading and fundamental principle of subsidiarity. Seven tasks came under exclusive federal responsibility; ten came under the control of the cantons; and twenty-two "common tasks" were allocated to both the Confederation and the cantons. For these common tasks, it was not possible to separate management from implementation. To deal with nineteen of them, the reform introduced the conventions-programs (CPs), which are public-law contracts signed by the Confederation with each canton. These CPs are generally valid for periods of four years (2008-11, 2012-15 and 2016-19, respectively); the third period is currently being prepared. Using principal-agent theory, I examine how contracts can improve political relations between a principal (the Confederation) and an agent (a canton). I also provide a first qualitative analysis of the impacts of these contracts on vertical cooperation and on the involvement of different actors, focusing on five CPs - protection of cultural heritage and conservation of historic monuments, encouragement of the integration of foreigners, economic development, protection against noise, and protection of nature and landscape - applied in five cantons, which represents twenty-five case studies.

Relevance: 30.00%

Abstract:

As the development of integrated circuit technology continues to follow Moore's law, the complexity of circuits increases exponentially. Traditional hardware description languages such as VHDL and Verilog are no longer powerful enough to cope with this level of complexity and do not provide facilities for hardware/software co-design. Languages such as SystemC are intended to solve these problems by combining the expressive power of high-level programming languages with the hardware-oriented facilities of hardware description languages. To fully replace older languages in the design flow of digital systems, SystemC should also be synthesizable. The devices required by modern high-speed networks often share with embedded systems the same tight constraints on size, power consumption and price, for example, but also have very demanding real-time and quality-of-service requirements that are difficult to satisfy with general-purpose processors. Dedicated hardware blocks of an application-specific instruction set processor are one way to combine fast processing speed, energy efficiency, flexibility and relatively short time-to-market. Common features can be identified in the network processing domain, making it possible to develop specialized but configurable processor architectures. One such architecture is TACO, which is based on the transport triggered architecture. The architecture offers a high degree of parallelism and modularity and greatly simplified instruction decoding. For this M.Sc. (Tech.) thesis, a simulation environment for the TACO architecture was developed with SystemC 2.2, using an old version written with SystemC 1.0 as a starting point. The environment enables rapid design space exploration by providing facilities for hardware/software co-design and simulation, and an extendable library of automatically configured reusable hardware blocks.
Other topics covered are the differences between SystemC 1.0 and 2.2 from the viewpoint of hardware modeling, and the compilation of a SystemC model into synthesizable VHDL with the Celoxica Agility SystemC Compiler. A simulation model of a processor for TCP/IP packet validation was designed and tested as a test case for the environment.

Relevance: 30.00%

Abstract:

We examine the scale invariants in the preparation of highly concentrated w/o emulsions at different scales and under varying conditions. The emulsions are characterized using rheological parameters, owing to their highly elastic behavior. We first construct and validate empirical models to describe the rheological properties; these models yield a reasonable prediction of the experimental data. We then build an empirical scale-up model to predict the preparation and composition conditions that have to be kept constant at each scale to prepare the same emulsion. For this purpose, three preparation scales with geometric similarity are used. The parameter N·D^α, a function of the stirring rate N, the scale (D, the impeller diameter) and the exponent α (calculated empirically from the regression of all the experiments at the three scales), is defined as the scale invariant that needs to be optimized once the dispersed phase of the emulsion, the surfactant concentration, and the dispersed-phase addition time are set. As far as we know, no other study has obtained a scale-invariant factor N·D^α for the preparation of highly concentrated emulsions covering three different scales, different addition times and different surfactant concentrations. The power-law exponent obtained seems to indicate that the scale-up criterion for this system is the power input per unit volume (P/V).
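The use of the invariant N·D^α can be sketched as follows (the numbers are hypothetical, not the study's fitted values). With α = 2/3 the invariant corresponds to constant power input per unit volume: in turbulent stirring P ∝ N³D⁵ and V ∝ D³, so P/V constant implies N·D^(2/3) constant.

```python
# Sketch: keep N * D**alpha constant across scales (hypothetical values).
alpha = 2.0 / 3.0            # exponent corresponding to constant P/V
N1, D1 = 700.0, 0.05         # stirring rate (rpm) and impeller diameter (m), small scale
D2 = 0.15                    # impeller diameter at the larger scale

# solve for the stirring rate at the larger scale
N2 = N1 * (D1 / D2) ** alpha

# the invariant is preserved...
assert abs(N1 * D1**alpha - N2 * D2**alpha) < 1e-9
# ...and so is P/V, which scales as N**3 * D**2
assert abs(N1**3 * D1**2 - N2**3 * D2**2) < 1e-6
```

In the study itself α is a regression output rather than an imposed value; the check above only shows why an α near 2/3 points to P/V as the scale-up criterion.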

Relevance: 30.00%

Abstract:

Social, technological, and economic time series are punctuated by events which are usually assumed to be random, albeit with some hierarchical structure. It is well known that the interevent statistics observed in these contexts differ from the Poissonian profile by being long-tailed, with resting and active periods interwoven. Understanding the mechanisms that generate such statistics has therefore become a central issue. The approach we present is taken from the continuous-time random-walk formalism and represents an analytical alternative to the models of non-trivial priority that have been proposed recently. Our analysis also goes one step further by looking at the multifractal structure of the interevent times of human decisions. We analyze the intertransaction time intervals of several financial markets and observe that the empirical data exhibit a subtle multifractal behavior. Our model explains this structure by taking the pausing-time density in the form of a superstatistics, where the integral kernel quantifies the heterogeneous nature of the executed tasks. A stretched-exponential kernel provides a multifractal profile valid over a certain limited range; a suggested heuristic analytical profile is capable of covering a broader region.
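The superstatistics idea can be illustrated with a deliberately crude sketch: mixing exponential waiting times over a distribution of rates (here just a two-point mixture of assumed weights and rates, not the paper's stretched-exponential kernel) already produces a density far heavier-tailed than a Poisson process with the same mean rate.

```python
import math

# (weight, rate) pairs: mostly fast tasks, a few very slow ones (assumed values)
rates = [(0.9, 2.0), (0.1, 0.05)]

def psi_mix(t):
    """Superstatistical pausing-time density: a mixture of exponentials,
    psi(t) = sum_i w_i * lam_i * exp(-lam_i * t)."""
    return sum(w * lam * math.exp(-lam * t) for w, lam in rates)

lam_eff = sum(w * lam for w, lam in rates)   # matched mean event rate

def psi_poisson(t):
    """Poissonian reference profile with the same mean rate."""
    return lam_eff * math.exp(-lam_eff * t)

# at long times the mixture decays far more slowly than the Poisson profile
assert psi_mix(20.0) > 100 * psi_poisson(20.0)
```

Replacing the two-point mixture by a continuous kernel (e.g. a stretched exponential in the rate) is what yields the multifractal interevent-time structure discussed above.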

Relevance: 30.00%

Abstract:

For various reasons, the administration of justice is permanently topical. One of the causes is the continuous novelty arising from modernization proposals in various areas, which attempt to alleviate the deficits that judicial action faces every day. The debate is old and ongoing, while at the same time a range of projects has appeared with the intention of carrying out a profound and decisive change in the vision of the structure of the Judiciary and its associated services. In particular, the proposed reform of the Organic Law on the Judiciary and of the Law on Judicial Demarcation and Structure entails the concentration of all courts in the provincial capitals, with the associated risk of distancing justice from citizens and related problems. The draft Code of Criminal Procedure opts for a new investigation system led by the Public Prosecutor's Office, with short fixed deadlines for concluding proceedings, the promotion of mediation and greater scope for discontinuing a case, as well as a concentration and simplification of all phases of criminal proceedings. Furthermore, the Civil Registry has been removed from the courts but, despite the passage of time, it has not been possible to identify which body will take over this function, generating considerable social concern. These and other developments are analyzed in depth, with a particular focus on Catalonia, seeking to offer solutions and proposals for improvement.

Relevance: 30.00%

Abstract:

The legislative reforms in university matters driven in recent years, beyond the controversies they have provoked, offer universities the possibility of developing a new model in line with the European environment, focused on quality aims and adapted to current socioeconomic challenges. A new educational model centered on the student, on the development of specific and transversal competences, on the improvement of employability and access to the labor market, and on the attraction and retention of talent is an indispensable condition for effective social mobility and for the homogeneous development of a more responsible and sustainable socioeconomic and productive model.

Relevance: 30.00%

Abstract:

The aim of this Master's thesis was to investigate options for the utilization of the bottom ash generated by the waste incineration plant of Kotkan Energia Oy, and to find the most cost-effective way of treating it. The utilization options considered were various earthworks and the covering of landfills; as an alternative to utilization, the feasibility of landfill disposal was examined. The use of bottom ash in earth construction and as cover material for landfill fillings, as well as its final disposal in landfills, is regulated by legislation. The Government Decree on Landfills defines leachability limit values for fractions to be disposed of, according to which wastes are placed in landfills for inert, ordinary or hazardous waste. A fraction used as landfill cover material must also meet the leachability limit values of the landfill class in question. The decree on earth construction, in turn, sets leachability and concentration limit values for fractions utilized in earth construction. The utilization of bottom ash, however, always requires an environmental permit. Based on analyses of the bottom ash from the Hyötyvoimala plant, treated bottom ash could be used, for example, in the base structures of fields and landfills and as cover material for landfill fillings. Utilizing bottom ash in earthworks for district heating networks still requires further study of the pipe corrosion risk. As a result of this thesis, an operating model was developed that enables Kotkan Energia Oy to save on the costs of ash treatment and utilization. The most cost-effective option proved to be treating the ash at the company's own intermediate biofuel storage site and utilizing the treated bottom ash in the base structures of that site.

Relevance: 30.00%

Abstract:

ABSTRACT: Agriculture is one of the most relevant activities of the Brazilian economy, and rice is among its main crops. The state of Rio Grande do Sul, in Southern Brazil, is responsible for 68.7% of domestic production (IBGE, 2013). The goal of this study was to develop a low-cost methodology with a regional scope to identify suitable areas for irrigated rice cropping in this state, using the spectro-temporal behavior of a vegetation index derived from MODIS images together with the HAND model. The study area was the southern half of the state. Using the HAND model, flood-prone areas were mapped in order to identify irrigated rice cultivation. We used multi-temporal vegetation index images from the MODIS sensor covering the period from August 2001 to May 2012. To assess the results, we used data collected in the field and cropped-area information from IBGE. The results showed that the proposed methodology was satisfactory, with a Kappa of 0.92 and a global accuracy of 98.18%. Thus, MODIS sensor data and the delineation of flood-prone areas by means of the HAND model produced an estimate of the irrigated rice area for the study area.
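The two accuracy measures quoted above can be computed from a confusion matrix. The sketch below uses hypothetical rice/non-rice counts (not the study's data) to show how global accuracy and Cohen's Kappa are obtained.

```python
# Hypothetical 2x2 confusion matrix: rows = reference class, cols = mapped class
conf = [[460, 5],
        [4, 531]]

n = sum(sum(row) for row in conf)

# global (overall) accuracy: fraction of correctly mapped pixels
observed = sum(conf[i][i] for i in range(2)) / n

# chance agreement: product of row and column marginals, summed over classes
expected = sum(
    sum(conf[i]) * sum(row[i] for row in conf) for i in range(2)
) / n**2

# Cohen's Kappa corrects observed agreement for chance agreement
kappa = (observed - expected) / (1 - expected)
```

With these made-up counts the global accuracy is 0.991 and Kappa is about 0.98; Kappa is always below the raw accuracy because it discounts the agreement expected by chance.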

Relevance: 30.00%

Abstract:

This work presents a new law-of-the-wall formulation for recirculating turbulent flows. An alternative expression for the internal length, which can be applied in the separated region, is also presented. The formulation is implemented in a numerical code which solves the k-epsilon model through a finite volume method. The theoretical results are compared with the experimental data of Vogel and Eaton (J. of Heat Transfer, Transactions of the ASME, vol. 107, pp. 922-929, 1985). The paper shows that the present formulation furnishes better results than the standard k-epsilon formulation.
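For context, the sketch below shows the *standard* logarithmic law of the wall that such formulations extend; the paper's recirculating-flow version is not reproduced here, and the von Kármán constant and additive constant take their textbook values.

```python
import math

kappa, B = 0.41, 5.0   # standard log-law constants (textbook values)

def u_plus(y_plus):
    """Dimensionless mean velocity u+ = (1/kappa) * ln(y+) + B in the log layer."""
    return math.log(y_plus) / kappa + B

# velocity grows logarithmically with dimensionless wall distance
assert u_plus(300.0) > u_plus(30.0) > 0.0
```

Near separation the friction velocity vanishes, which is precisely why the standard form above breaks down and an alternative internal length is needed in the separated region.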

Relevance: 30.00%

Abstract:

A control law was designed for a satellite launcher (rocket) vehicle using eigenstructure assignment, so that the vehicle tracks a reference attitude while the yaw response is decoupled from roll and pitch manoeuvres and the pitch response is decoupled from roll and yaw manoeuvres. The design was based on a complete linear coupled model obtained from the complete non-linear vehicle model by linearization at each trajectory point. Finally, the design was assessed with the time-varying non-linear vehicle model, showing good performance and robustness. The design method used is explained, and a case study for the Brazilian satellite launcher (the VLS rocket) is reported.
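The core idea of assigning closed-loop dynamics by state feedback can be shown on a toy model (a double integrator, not the launcher's linearized dynamics; full eigenstructure assignment also shapes eigenvectors, which this scalar-input sketch cannot do). For x' = Ax + Bu with A = [[0, 1], [0, 0]], B = [0, 1] and u = -[k1, k2]·x, the closed-loop characteristic polynomial is s² + k2·s + k1, so the gains follow directly from the desired eigenvalues.

```python
import cmath

# desired closed-loop eigenvalues (assumed illustrative values)
p1, p2 = -2.0, -3.0

# match s**2 + k2*s + k1 to (s - p1)*(s - p2)
k1 = p1 * p2          # constant coefficient
k2 = -(p1 + p2)       # linear coefficient

# verify: the roots of the closed-loop polynomial are the desired eigenvalues
disc = cmath.sqrt(k2**2 - 4.0 * k1)
roots = sorted([(-k2 + disc) / 2.0, (-k2 - disc) / 2.0], key=lambda z: z.real)
assert abs(roots[0] - p2) < 1e-9 and abs(roots[1] - p1) < 1e-9
```

For the launcher, the same matching is done on the full coupled multivariable model, where the extra freedom in choosing eigenvectors is what decouples the yaw, pitch and roll channels.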

Relevance: 30.00%

Abstract:

The capabilities, and thus the design complexity, of VLSI-based embedded systems have increased tremendously in recent years, riding the wave of Moore's law. Time-to-market requirements are also shrinking, imposing challenges on designers, who in turn seek to adopt new design methods to increase their productivity. In answer to these pressures, modern-day systems have moved towards on-chip multiprocessing, and new on-chip multiprocessing architectures have emerged to exploit the tremendous advances in fabrication technology. Platform-based design is a possible solution to these challenges. The principle behind the approach is to separate the functionality of an application from the organization and communication architecture of the hardware platform at several levels of abstraction. Existing design methodologies for the platform-based design approach do not provide full automation at every level of the design process, and the co-design of platform-based systems sometimes leads to sub-optimal systems. In addition, the design productivity gap in multiprocessor systems remains a key challenge under existing design methodologies. This thesis addresses these challenges and discusses the creation of a development framework for platform-based system design in the context of the SegBus platform, a distributed communication architecture. The research aims to provide automated procedures for platform design and application mapping. Structural verification support is also featured, thus ensuring correct-by-design platforms. The solution is based on a model-based process: both the platform and the application are modeled using the Unified Modeling Language. The thesis develops a Domain Specific Language to support platform modeling, based on a corresponding UML profile, and Object Constraint Language constraints are used to support structurally correct platform construction.
An emulator is introduced to allow performance estimation of the solution that is as accurate as possible at high abstraction levels. VHDL code is automatically generated in the form of "snippets" to be employed in the arbiter modules of the platform, as required by the application. The resulting framework is applied in building an actual design solution for an MP3 stereo audio decoder application.

Relevance: 30.00%

Abstract:

Mass transfer kinetics in osmotic dehydration is usually modeled by Fick's law, empirical models, or probabilistic models. The aim of this study was to determine the applicability of the Peleg model for investigating mass transfer during the osmotic dehydration of mackerel (Scomber japonicus) slices at different temperatures. Osmotic dehydration was performed on mackerel slices by cooking-infusion in solutions with glycerol and salt (a_w = 0.64) at 50, 70, and 90 ºC. The Peleg rate constant K1 (h (g/g dry matter)^-1) varied with temperature from 0.761 to 0.396 for water loss, from 5.260 to 2.947 for salt gain, and from 0.854 to 0.566 for glycerol uptake. In all cases it followed the Arrhenius relationship (R² > 0.86). The activation energies Ea (kJ/mol) obtained were 16.14, 14.21, and 10.12 for water, salt, and glycerol, respectively. The statistical parameters that qualify the goodness of fit (R² > 0.91 and RMSE < 0.086) indicate promising applicability of the Peleg model.
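The Peleg model used above has a simple two-parameter form, X(t) = X0 + t / (K1 + K2·t), where 1/K1 is the initial transfer rate and X0 + 1/K2 the equilibrium value. The sketch below uses hypothetical constants (the study reports separate K1 values per variable and temperature).

```python
# Minimal sketch of the Peleg model with assumed illustrative constants.
X0, K1, K2 = 0.0, 0.5, 2.0

def peleg(t):
    """Peleg model: X(t) = X0 + t / (K1 + K2 * t) for t > 0."""
    return X0 + t / (K1 + K2 * t) if t > 0 else X0

# short-time slope approaches 1/K1 (the initial transfer rate)
assert abs(peleg(1e-6) / 1e-6 - 1.0 / K1) < 1e-3
# long-time value approaches X0 + 1/K2 (the equilibrium amount)
assert abs(peleg(1e6) - (X0 + 1.0 / K2)) < 1e-3
```

The temperature dependence reported above enters through K1: fitting ln(1/K1) against 1/T gives the Arrhenius activation energy for each transferred species.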