868 results for damage evolution process
Abstract:
A nature-inspired decentralised multi-agent algorithm is proposed to solve a problem of distributed task allocation in which cities produce and store batches of different mail types. Agents must collect and process the mail batches without global knowledge of their environment or communication between agents. The problem is constrained so that agents are penalised for switching mail types. When an agent processes a mail batch of a different type from the previous one, it must undergo a change-over, with repeated change-overs rendering the agent inactive. The efficiency (average amount of mail retrieved) and the flexibility (ability of the agents to react to changes in the environment) are investigated in both static and dynamic environments and with respect to sudden changes. New rules for mail selection and specialisation are proposed and are shown to exhibit improved efficiency and flexibility compared to existing ones. We employ an evolutionary algorithm which allows the various rules to evolve and compete. Apart from obtaining optimised parameters for the various rules for any environment, we also observe extinction and speciation.
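As a minimal sketch of the kind of decentralised selection rule the abstract describes (the scoring function, specialisation values, and switching penalty are illustrative assumptions, not the paper's actual rules), an agent might pick its next mail batch like this:

```python
# Minimal sketch (not the paper's actual rule): an agent scores each
# available mail type by its current specialisation level and applies a
# hypothetical penalty when the type differs from the one it processed
# last, since change-overs are costly and repeated ones disable the agent.
def choose_batch(last_type, specialisation, available_types, switch_penalty=0.5):
    """Pick the mail type with the highest score; switching types costs a penalty."""
    def score(mail_type):
        s = specialisation.get(mail_type, 0.1)
        if last_type is not None and mail_type != last_type:
            s -= switch_penalty
        return s
    return max(available_types, key=score)

# An agent specialised in type "A" keeps collecting "A" batches.
spec = {"A": 0.9, "B": 0.5}
print(choose_batch("A", spec, ["A", "B"]))  # "A"
```

Each agent runs this locally, so no global knowledge or inter-agent communication is required, matching the decentralised setting described above.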
Abstract:
A quasi-biotic model of knowledge evolution has been applied to manufacturing technology capability development, which includes product design and development and manufacturing process/workflow improvement. The concepts of “knowledge genes” and “knowledge body” are introduced to explain the evolution of technological capability. It is shown that knowledge development within the enterprise happens as a result of interactions between an enterprise’s internal knowledge and that acquired from external sources, catalysed by: (a) internal mechanisms, resources and incentives, and (b) actions and policies of external agencies. A matrix specifying factors contributing to knowledge development and types of manufacturing capabilities (product design, equipment development or use, and workflow) is developed to explain technological knowledge development. Case studies of Tianjin Pipe Corporation (TPCO) and Tianjin Tianduan Press Co. are presented to illustrate the application of the matrix.
Abstract:
This research aims to examine the effectiveness of Soft Systems Methodology (SSM) in enabling systemic change within local government and local NHS environments, and to examine the role of the facilitator within this process. Checkland's Mode 2 variant of Soft Systems Methodology was applied on an experimental basis in two environments, Herefordshire Health Authority and Sandwell Health Authority. The Herefordshire application used SSM in the design of an Integrated Care Pathway for stroke patients. In Sandwell, SSM was deployed to assist in the design of an Information Management and Technology (IM&T) Strategy for the boundary-spanning Sandwell Partnership. Both of these environments were experiencing significant organisational change as the experiments unfolded. The explicit objectives of the research were: to examine the evolution and development of SSM and to contribute to its further development; to apply the Soft Systems Methodology to change processes within the NHS; to evaluate the potential role of SSM in this wider process of change; to assess the role of the researcher as a facilitator within this process; and to develop a critical framework through which the impact of SSM on change might be understood and assessed. In developing these objectives, it became apparent that there was a gap in knowledge relating to SSM, concerning the evaluation of the role of the approach in the change process. The case studies highlighted issues in stakeholder selection and management; the communicative assumptions in SSM; the ambiguous role of the facilitator; and the impact of highly politicised problem environments on the effectiveness of the methodology in the process of change. An augmented variant of SSM that integrates an appropriate (social constructivist) evaluation method is outlined, together with a series of hypotheses about the operationalisation of this proposed method.
Abstract:
Our paper presents the work of the Cuneiform Digital Forensic Project (CDFP), an interdisciplinary project at The University of Birmingham concerned with the development of a multimedia database to support scholarly research into cuneiform, the wedge-shaped writing imprinted onto clay tablets that constitutes the earliest true form of writing. We describe the evolutionary design process and the dynamic research and development cycles associated with the database. Unlike traditional publications, the electronic publication of resources offers the possibility of almost continuous revision, with the integration and support of new media and interfaces. However, if on-line resources are to win the favor and confidence of their respective communities, there must be a clear distinction between published, maintained resources and developmental content. Published material should, ideally, be supported via standard web-browser interfaces with fully integrated tools so that users receive a reliable, homogeneous and intuitive flow of information and media relevant to their needs. We discuss the inherent dynamics of the design and publication of our on-line resource, starting with the basic design and maintenance aspects of the electronic database, which includes photographic instances of cuneiform signs, and show how the continuous review process identifies areas for further research and development, for example the “sign processor” graphical search tool and three-dimensional content, the results of which then feed back into the maintained resource.
Abstract:
Halogen-containing aromatics, mainly bromine-containing phenols, are harmful compounds contaminating pyrolysis oil from electronic boards containing halogenated flame retardants. In addition, their formation increases the potential for evolution of polybrominated dibenzo-p-dioxins (PBDDs) and dibenzofurans (PBDFs) at relatively low temperature (up to 500 °C). As a model compound, 2,4-dibromophenol (DBP) was pyrolyzed at 290-450 °C. While its pyrolysis in a nitrogen flow reactor or in encapsulated ampules yields bromine-containing phenols, phenoxyphenols, PBDDs, and PBDFs, pyrolysis of DBP in a hydrogen-donating medium of polypropylene (PP) at 290-350 °C mainly results in the formation of phenol and HBr, indicating the occurrence of a facile hydrodebromination of DBP. The hydrodebromination efficiency depends on temperature, pressure, and the ratio of the initial components. This thermal behavior of DBP is compared to that of 2,4-dichlorophenol and decabromodiphenyl ether. Treatment of halogen-containing aromatics with PP offers a new perspective on the development of low-environmental-impact disposal processes for electronic scrap. © 2005 American Chemical Society.
Abstract:
Like the genetic algorithm, evolution strategy is a process of continuous reproduction, trial, and selection, in which each new generation is an improvement on the one that went before. This paper presents two different proposals based on the vector space model (VSM), a traditional model in information retrieval. The first uses an evolution strategy (ES); the second uses the document centroid (DC) in a query expansion technique. The results are then compared, showing that the ES technique is more efficient than the other method.
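As a hedged sketch of the two ingredients named above, the following shows cosine ranking under the vector space model and a document-centroid query expansion; the vectors, `top_k`, and the mixing weight `alpha` are illustrative assumptions, not the paper's data or tuned parameters:

```python
import numpy as np

# Sketch (illustrative, not the paper's implementation): cosine similarity
# ranks documents against the query under the VSM, and the document-centroid
# (DC) expansion mixes the centroid of the top-ranked documents back into
# the query vector.
def cosine(a, b):
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

def expand_with_centroid(query, docs, top_k=2, alpha=0.5):
    """Rank docs against the query, then add alpha times the top-k centroid."""
    ranked = sorted(docs, key=lambda d: cosine(query, d), reverse=True)
    centroid = np.mean(ranked[:top_k], axis=0)
    return query + alpha * centroid

docs = [np.array([1.0, 0.0, 1.0]),
        np.array([0.9, 0.1, 0.8]),
        np.array([0.0, 1.0, 0.0])]
query = np.array([1.0, 0.0, 0.0])
expanded = expand_with_centroid(query, docs)
# the expanded query now also weights terms shared by the top-ranked documents
```

The ES proposal would instead mutate and select candidate query vectors over generations; the centroid expansion above is the simpler, non-evolutionary baseline it is compared against.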
Abstract:
The body of work presented in this thesis is in three main parts: [1] the effect of ultrasound on freezing events in ionic systems, [2] the importance of formulation osmolality in freeze drying, and [3] a novel system for increasing the primary freeze drying rate. Chapter 4 briefly presents the work on method optimisation, which is still very much in its infancy. Aspects of freezing such as nucleation and ice crystal growth are strongly related to ice crystal morphology; however, the ice nucleation process typically occurs in a random, non-deterministic and spontaneous manner. In view of this, ultrasound, an emerging application in the pharmaceutical sciences, has been applied to accelerate nucleation and shorten the freezing process. The research presented in this thesis aimed to study the effect of sonication on nucleation events in ionic solutions and, more importantly, how sonication affects the freezing process. This work confirmed that nucleation does occur in a random manner. It also showed that ultrasonication accelerates the ice nucleation process and increases the freezing rate of a solution. Cryopreservation of animal sperm is an important aspect of breeding in animal science, especially for endangered species. For sperm cryopreservation to be successful, cryoprotectants as well as semen extenders are used. One of the factors in optimising semen preservation media is the osmolality of the semen extenders used. Although preservation of animal sperm has no direct relation to freeze drying of pharmaceuticals, it was used in this thesis to make a case for considering the osmolality of a formulation (prepared for freeze drying) as a factor conferring protein protection against the stresses of freeze drying. The osmolalities of some common solutes (mostly sugars) used in freeze drying were determined (molal concentrations from 0.1 m to 1.2 m).
Preliminary investigations of the osmolality and osmotic coefficients of common solutes were carried out. It was observed that the osmotic coefficient trends for the sugars analysed could be grouped by sugar type. The trends observed show the need for further studies of osmolality and of how it may be important to protein or API protection during freeze drying processes. Primary drying is usually the longest part of the freeze drying process, and primary drying times lasting days or even weeks are not uncommon; longer primary drying times lead to longer freeze drying cycles and, consequently, increased production costs. Much work has been done previously by others using different processes (such as annealing) to improve primary drying times; however, these do not come without drawbacks. A novel system, involving the formation of a frozen vial system that creates a void between the formulation and the inside wall of the vial, has been devised to increase the primary freeze drying rate of formulations without product damage. Although the work is far from complete, it has been shown that it is possible to improve and increase the primary drying rate of formulations without modifying existing formulations, changing storage vials, or increasing the surface area of freeze dryer shelves.
Abstract:
This paper presents a new interpretation of the Superpave IDT strength test based on a viscoelastic-damage framework. The framework is based on continuum damage mechanics and the thermodynamics of irreversible processes with an anisotropic damage representation. The new approach accounts for viscoelastic effects and the damage accumulation that accompanies the fracture process when interpreting the Superpave IDT strength test to identify the Dissipated Creep Strain Energy (DCSE) limit from the test result. The viscoelastic model is implemented in a Finite Element Method (FEM) program for the simulation of the Superpave IDT strength test. The DCSE values obtained using the new approach are compared with the values obtained using the conventional approach to evaluate the validity of the assumptions made in the conventional interpretation of the test results. The results show that the conventional approach overestimates the DCSE value, with the estimation error increasing at higher deformation rates.
Abstract:
Mass production, cars, pollution: they have long since become well-known and closely connected phenomena of modern life. Nowadays people can also add to the list items such as awareness, scientific approach, long-term thinking, and environmental responsibility. They are surrounded by a multitude of consumer goods, most of which are produced in a scientific manner, and all of which will sooner rather than later end up in the garbage. Cars are the most noticeable, both by size and by numbers, and also the most expensive of all the mass products in people’s view. For many people they are a clear target for reprimand and regulation, and, as a result, the automotive industry is being increasingly brought under bureaucratic control, together with its whole supplier and distributor network. The author started writing this article in an attempt to place the above process under scrutiny, because it is his firm belief that similar measures and similarly tough governmental control will inevitably spill over to other industries, which at the moment are producing more inconspicuous, but still polluting, products. The present paper shows the relationship between car making, supply chain management and the efforts of public administration to protect the environment, a connection with clear practical implications.
Abstract:
Prices of U.S. Treasury securities vary over time and across maturities. When the market in Treasurys is sufficiently complete and frictionless, these prices may be modeled by a function of time and maturity. A cross-section of this function with time held fixed is called the yield curve; the aggregate of these sections is the evolution of the yield curve. This dissertation studies aspects of this evolution. There are two complementary approaches to the study of yield curve evolution here: the first is principal components analysis; the second is wavelet analysis. In both approaches, both the time and maturity variables are discretized. In principal components analysis the vectors of yield curve shifts are viewed as observations of a multivariate normal distribution. The resulting covariance matrix is diagonalized, and the resulting eigenvalues and eigenvectors (the principal components) are used to draw inferences about the yield curve evolution. In wavelet analysis, the vectors of shifts are resolved into hierarchies of localized fundamental shifts (wavelets) that leave specified global properties invariant (average change and duration change). The hierarchies relate to the degree of localization, with movements restricted to a single maturity at the base and general movements at the apex. Second-generation wavelet techniques allow better adaptation of the model to economic observables. Statistically, the wavelet approach is inherently nonparametric, while the wavelets themselves are better adapted to describing a complete market. Principal components analysis provides information on the dimension of the yield curve process. While there is no clear demarcation between operative factors and noise, the top six principal components pick up 99% of total interest rate variation 95% of the time.
An economically justified basis for this process is hard to find; for example, a simple linear model will not suffice for the first principal component, and the shape of this component is nonstationary. Wavelet analysis works more directly with yield curve observations than principal components analysis. In fact, the complete process from bond data to multiresolution is presented, including the dedicated Perl programs and the details of the portfolio metrics and the specially adapted wavelet construction. The result is more robust statistics, which provide balance to the more fragile principal components analysis.
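The principal-components step described above can be sketched as follows; the yield-curve shifts here are synthetic stand-ins (a dominant parallel "level" move plus small noise), not Treasury data:

```python
import numpy as np

# Sketch of the PCA approach: treat yield-curve shifts (rows = days,
# columns = maturities) as multivariate observations, diagonalise their
# covariance matrix, and report the share of total variation captured by
# the top component. Synthetic data: a parallel level move plus noise.
rng = np.random.default_rng(0)
n_days, n_maturities = 500, 8
level = rng.normal(size=(n_days, 1)) * np.ones((1, n_maturities))
shifts = level + 0.1 * rng.normal(size=(n_days, n_maturities))

cov = np.cov(shifts, rowvar=False)
eigvals = np.linalg.eigvalsh(cov)[::-1]   # eigenvalues, largest first
explained = eigvals / eigvals.sum()
print(f"top component explains {explained[0]:.1%} of variation")
```

With real yield data the drop-off across components is less stark, which is what motivates the abstract's question of where operative factors end and noise begins.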
Abstract:
The North Atlantic Treaty Organization (NATO) is a product of the Cold War, through which its members organized their military forces for the purpose of collective defense against the common threat of Soviet-backed aggression. Employing the terminology of regime theory, the creation of NATO can be viewed as the introduction of an international security regime. Throughout the Cold War, NATO member states preserved their commitment to mutual defense while increasingly engaging in activities aimed at overcoming the division of Europe and promoting regional stability. The end of the Cold War served as the catalyst for a new period of regime change as the Alliance introduced elements of a collective security regime, expanding its mandate to address new security challenges and reorganizing both its political and military organizational structures. This research involves an interpretive analysis of NATO's evolution, applying ideal theoretical constructs associated with distinct approaches to regime analysis. The process of regime change is investigated over several periods throughout the history of the Alliance in an effort to understand the Alliance's changing commitment to collective security. The research includes a review of the regime theory literature and an examination of primary source documentation, including official documents and treaties, as well as numerous secondary sources. This review is organized around a typology of power-based, organization-based, and norm-based approaches to regime analysis. This dissertation argues that the process of regime change within NATO is best understood by examining factors associated with multiple theoretical constructs. Relevant factors provide insights into the practice of collective security among NATO member states within Europe, while accounting for the inability of the NATO allies to build on the experience gained within Europe to play a more central role in operations outside of this region.
This research contributes to a greater understanding of the nature of international regimes and the process of regime change, while offering recommendations aimed at increasing NATO's viability as a source of greater security and more meaningful international cooperation.
Abstract:
In recent decades, the rapid development of optical spectroscopy for tissue diagnosis has been indicative of its high clinical value. The goal of this research is to prove the feasibility of using diffuse reflectance spectroscopy and fluorescence spectroscopy to assess myocardial infarction (MI) in vivo. The proposed optical technique was designed to be an intra-operative guidance tool that can provide useful information about the condition of an infarct for surgeons and researchers. In order to gain insight into the pathophysiological characteristics of an infarct, two novel spectral analysis algorithms were developed to interpret diffuse reflectance spectra. The algorithms were developed based on the unique absorption properties of hemoglobin, for the purpose of retrieving regional hemoglobin oxygenation saturation and concentration data in tissue from diffuse reflectance spectra. The algorithms were evaluated and validated using both simulated and actual experimental data. Finally, the hypothesis of the study was validated using a rabbit model of MI, in which the MI was induced by ligation of a major coronary artery of the left ventricle. Three to four weeks after the MI was induced, the extent of myocardial tissue injury and the evolution of the wound healing process were investigated using the proposed spectroscopic methodology as well as histology. The correlations between spectral alterations and histopathological features of the MI were analyzed statistically. The results of this PhD study demonstrate the applicability of the proposed optical methodology for assessing myocardial tissue damage induced by MI in vivo. The results of the spectral analysis suggest that connective tissue proliferation induced by MI significantly alters the characteristics of diffuse reflectance and fluorescence spectra, and that the magnitudes of the alterations can be quantitatively related to the severity and extent of connective tissue proliferation.
Abstract:
Freeze events significantly influence landscape structure and community composition along subtropical coastlines. This is particularly true in south Florida, where such disturbances have historically contributed to patch diversity within the mangrove forest and have played a part in limiting its inland transgression. With projected increases in mean global temperatures, such instances are likely to become much less frequent in the region, contributing to a reduction in heterogeneity within the mangrove forest itself. To understand the process more clearly, we explored the dynamics of a Dwarf mangrove forest following two chilling events that produced freeze-like symptoms, i.e., leaf browning, desiccation, and mortality, and interpreted the resulting changes within the context of current winter temperatures and projected future scenarios. Structural effects from a 1996 chilling event were dramatic, with mortality and tissue damage concentrated among individuals comprising the Dwarf forest's low canopy. This disturbance promoted understory plant development and provided an opportunity for Laguncularia racemosa to share dominance with Rhizophora mangle. Mortality due to the less severe 2001 event was greatest in the understory, probably because recovery of the protective canopy following the earlier freeze was still incomplete. Stand dynamics were static over the same period in nearby unimpacted sites. The probability of reaching temperatures as low as those recorded at a nearby meteorological station (≤3 °C) under several warming scenarios was simulated by applying 1 °C incremental temperature increases to a model developed from a 42-year temperature record. According to the model, the frequency of similar chilling events decreased from once every 1.9 years at present to once every 3.4 and 32.5 years with 1 and 4 °C of warming, respectively.
The large decrease in the frequency of these events would eliminate an important mechanism that maintains Dwarf forest structure, and promotes compositional diversity.
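The warming simulation described above can be sketched as follows; the 42 annual minima are hypothetical stand-ins for the station record, so the intervals printed are illustrative, not the study's 1.9/3.4/32.5-year figures:

```python
import numpy as np

# Sketch of the warming simulation: shift each year's minimum temperature
# upward by a fixed increment and recompute how often the chilling
# threshold (<= 3 deg C) is reached. The annual minima below are synthetic
# stand-ins for the 42-year meteorological record used in the study.
rng = np.random.default_rng(1)
annual_minima = rng.normal(loc=4.0, scale=2.0, size=42)  # hypothetical record

def return_interval(minima, warming, threshold=3.0):
    """Mean number of years between events whose annual minimum <= threshold."""
    events = int(np.sum(minima + warming <= threshold))
    return float("inf") if events == 0 else len(minima) / events

for w in (0.0, 1.0, 4.0):
    print(f"+{w} degC warming: one chilling event every "
          f"{return_interval(annual_minima, w):.1f} years")
```

Because warming only removes events from the count, the return interval is non-decreasing in the warming increment, mirroring the lengthening intervals reported in the abstract.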
Abstract:
In his essay - Toward a Better Understanding of the Evolution of Hotel Development: A Discussion of Product-Specific Lodging Demand - by John A. Carnella, Consultant, Laventhol & Horwath, cpas, New York, Carnella initially describes his piece by stating: “The diversified hotel product in the United States lodging market has resulted in latent room-night demand, or supply-driven demand resulting from the introduction of a lodging product which caters to a specific set of hotel patrons. The subject has become significant as the lodging market has moved toward segmentation with regard to guest room offerings. The author proposes that latent demand is a tangible, measurable phenomenon best understood in light of the history of the guest room product from its infancy to its present state.” The article opens with a brief depiction of hotel development in the United States, both pre- and post-World War II. To put it succinctly, the author wants you to know that the advent of the interstate highway system changed the complexion of the hotel industry in the U.S. “Two essential ingredients were necessary for the next phase of hotel development in this country. First was the establishment of the magnificently intricate infrastructure which facilitated motor vehicle transportation in and around the then 48 states of the nation,” says Carnella. “The second event…was the introduction of affordable highway travel.” Carnella goes on to say that the next big thing in hotel evolution was the introduction of affordable air travel. “With the airways filled with potential lodging guests, developers moved next to erect a new genre of hotel, the airport hotel,” Carnella continues. Growth progressed with the arrival of the suburban hotel concept, which was fueled not by developments in transportation but by changes in people’s living habits, i.e., suburban affiliations as opposed to urban and city population aggregates.
The author explores the distinctions between full-service and limited-service lodging operations. “The market of interest with consideration to the extended-stay facility is one dominated by corporate office parks,” Carnella proceeds. These evolutionary stages speak to latent demand, and even further to segmentation of the market. “Latent demand… is a product-generated phenomenon in which the number of potential hotel guests increases as the direct result of the introduction of a new lodging facility,” Carnella brings his unique insight to the table with regard to the specialization process. The demand is already there, just waiting to be tapped. In closing, “…there must be a consideration of the unique attributes of a lodging facility relative to its ability to attract guests to a subject market, just as there must be an examination of the property's ability to draw guests from within the subject market,” Carnella proposes.
Abstract:
A major portion of hurricane-induced economic loss originates from damage to building structures. Such damage is typically grouped into three main categories: exterior, interior, and contents damage. Although the latter two categories in most cases cause more than 50% of the total loss, little has been done to investigate the physical damage process and unveil the interdependence of interior damage parameters. Building interior and contents damage is mainly due to wind-driven rain (WDR) intrusion through building envelope defects, breaches, and other functional openings. The limited research and the resulting knowledge gaps are in large part due to the complexity of damage phenomena during hurricanes and the lack of established measurement methodologies to quantify rainwater intrusion. This dissertation focuses on devising methodologies for large-scale experimental simulation of tropical cyclone WDR and measurement of rainwater intrusion, to acquire benchmark test-based data for the development of a hurricane-induced building interior and contents damage model. Target WDR parameters derived from tropical cyclone rainfall data were used to simulate the WDR characteristics at the Wall of Wind (WOW) facility. The proposed WDR simulation methodology presents detailed procedures for selecting the type and number of nozzles, formulated based on a tropical cyclone WDR study. The simulated WDR was then used to experimentally investigate the mechanisms of rainwater deposition/intrusion in buildings. A test-based dataset of two rainwater intrusion parameters that quantify the distribution of direct impinging raindrops and surface runoff rainwater over the building surface, the rain admittance factor (RAF) and the surface runoff coefficient (SRC) respectively, was developed using common shapes of low-rise buildings.
The dataset was applied to a newly formulated WDR estimation model to predict the volume of rainwater ingress through envelope openings such as wall and roof deck breaches and window sill cracks. Validation of the new model against experimental data indicated reasonable estimation of rainwater ingress through envelope defects and breaches during tropical cyclones. The WDR estimation model and the experimental dataset of WDR parameters developed in this dissertation can be used to enhance the prediction capabilities of existing interior damage models such as the Florida Public Hurricane Loss Model (FPHLM).
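To illustrate how the two intrusion parameters defined above might combine in an ingress estimate, here is a hedged sketch; the formula, coefficient values, and rain depths are hypothetical assumptions for illustration, not the dissertation's actual WDR estimation model:

```python
# Illustrative sketch, not the dissertation's model: the rain admittance
# factor (RAF) scales the direct impinging rain reaching an opening, and
# the surface runoff coefficient (SRC) scales the runoff contribution.
# All numeric values below are hypothetical.
def rainwater_ingress_litres(impinging_rain_mm, runoff_mm, opening_area_m2, raf, src):
    """Estimated rainwater volume through an envelope opening (1 mm over 1 m^2 = 1 L)."""
    direct = raf * impinging_rain_mm * opening_area_m2   # direct raindrop impingement
    runoff = src * runoff_mm * opening_area_m2           # surface runoff contribution
    return direct + runoff

# e.g. a 0.5 m^2 breach receiving 100 mm of impinging rain and 50 mm of runoff
print(rainwater_ingress_litres(100.0, 50.0, 0.5, raf=0.4, src=0.2))  # 25.0
```

Separating the direct-impingement and runoff terms mirrors why the dissertation measures RAF and SRC as distinct parameters over the building surface.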