847 results for Systems-based agents


Relevance: 40.00%

Abstract:

A spectrofluorometric method has been developed and validated for the determination of gemfibrozil. The method is based on the intrinsic excitation and emission of gemfibrozil, with excitation and emission wavelengths of 276 and 304 nm, respectively. It allows the determination of the drug in a self-nanoemulsifying drug delivery system (SNEDDS) designed to improve its intestinal absorption. The results showed linear relationships with good correlation coefficients (r² > 0.999) and low limits of detection and quantification (LOD of 0.075 μg mL⁻¹ and LOQ of 0.226 μg mL⁻¹) over the range 0.2-5 μg mL⁻¹; the method also showed good robustness and stability. The amounts of gemfibrozil released from the SNEDDS contained in gastro-resistant hard gelatine capsules were thus analysed, and release studies could be performed satisfactorily.
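
The quoted LOD and LOQ stand in a ratio of roughly 10 : 3.3, which suggests the standard ICH formulas LOD = 3.3σ/S and LOQ = 10σ/S, where S is the calibration slope and σ the residual standard deviation. As a hedged sketch with invented calibration data (not the paper's measurements), such figures would be computed like this:

```python
# Hedged illustration of ICH-style validation figures; the calibration
# data below are invented placeholders, not the paper's measurements.
import numpy as np

conc = np.array([0.2, 0.5, 1.0, 2.0, 3.0, 4.0, 5.0])              # ug/mL
intensity = np.array([4.3, 10.0, 20.9, 40.5, 61.4, 80.9, 102.1])  # a.u.

slope, intercept = np.polyfit(conc, intensity, 1)
residuals = intensity - (slope * conc + intercept)
sigma = residuals.std(ddof=2)  # ddof=2: two fitted parameters

r2 = 1 - (residuals**2).sum() / ((intensity - intensity.mean())**2).sum()
lod = 3.3 * sigma / slope      # ICH limit of detection
loq = 10.0 * sigma / slope     # ICH limit of quantification
print(f"r^2 = {r2:.4f}, LOD = {lod:.3f} ug/mL, LOQ = {loq:.3f} ug/mL")
```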

Relevance: 40.00%

Abstract:

Of the many dimensions of the problem of violence exercised by men toward women within partner or ex-partner relationships, this article deals with the analysis of the discursive productions of the institutional actors who take part in the judicial process. Our intention is to investigate the relationship between criminal law and gender-based violence, starting from the implementation of the Law of Integral Gender-based Violence in Spain (LO 1/2004), from a theoretical perspective that includes contributions from social psychology and socio-legal feminism. We have approached the legal instrument, the Law of Integral Gender-based Violence, through the discourse of legal officers, with a perspective that questions the values, so often proclaimed, of universality, objectivity and neutrality of the law.

Relevance: 40.00%

Abstract:

Polymeric materials have been used in dental applications for decades, and adhesion of polymeric materials to each other and to the tooth substrate is essential to their successful use. The aim of this series of studies was twofold: first, to improve the adhesion of poly(paraphenylene)-based rigid rod polymer (RRP) to other dental polymers, and second, to evaluate the usability of a new dentin primer system based on RRP fillers. Poly(paraphenylene)-based RRP would be an attractive material for dental applications because of its good mechanical properties, but its use requires reliable adhesion between RRP and other dental polymers. In this series of studies, the adhesion of RRP to denture base polymer and the mechanical properties of the RRP-denture base polymer combination were evaluated, and the adhesion of BisGMA-TEGDMA resin to RRP was determined. Different surface treatments were tested to improve the adhesion of BisGMA-TEGDMA resin to RRP. Three-point bending tests, Vickers surface hardness tests and scanning electron microscopy (SEM) showed that no reliable adhesion between RRP and denture base polymer was formed. Addition of RRP filler to denture base polymer increased surface hardness and flexural modulus, but flexural strength decreased. Shear bond strength testing and SEM revealed that the adhesion between resin and RRP could be improved by surface treatment with a dichloromethane (DCM) based primer, and that a new kind of adhesive surface can be designed. Current dentin bonding agents have good immediate bond strength, but in the long term the bond strength may decrease due to the detrimental effect of water and perhaps of matrix metalloproteinases, which causes problems in the longevity of restorations. Current bonding agents use organic monomers. In this series of studies, an RRP-filled dentin primer was tested in order to decrease the water sorption of the monomer system of the primers. The properties of the new dentin primer system were evaluated in vitro by comparing it to a commercial etch-and-rinse adhesive system. Contact angle measurements and SEM showed that the experimental primer with RRP reinforcement provided resin infiltration into dentin collagen and formed a resin-dentin interface similar to the control primer. Microtensile bond strength testing and SEM revealed that in short-term water storage RRP increased bond strength, and that a primer with BMEP monomer (bis[2-(methacryloyloxy)ethyl] phosphate) and a high solvent concentration provided bonding properties comparable to the commercial control primers. In long-term water storage, the high solvent-monomer concentration of the experimental primers decreased bond strength; however, in the low solvent-monomer concentration groups, long-term water storage did not decrease the bond strength despite the hydrophilic monomers used in the system. These studies demonstrated that the new dentin primer system reached the mechanical properties of a current traditional etch-and-rinse adhesive system in short-term water storage, and improved properties can be achieved by further modifications of the monomer system. The studies of the adhesion of RRP to other polymers suggest that adhesion between RRP and other dental polymers can be obtained with certain surface treatments.
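
The mechanical results mentioned above come from three-point bending; for reference, the standard beam formulas behind such figures are sketched below with invented specimen dimensions (not data from the study).

```python
# Standard three-point bending formulas (ISO 178-style), sketched with
# invented specimen numbers; not measurements from the study itself.
def flexural_strength(load_n: float, span_mm: float,
                      width_mm: float, thickness_mm: float) -> float:
    """sigma_f = 3 F L / (2 b d^2), result in MPa (N/mm^2)."""
    return 3 * load_n * span_mm / (2 * width_mm * thickness_mm**2)

def flexural_modulus(slope_n_per_mm: float, span_mm: float,
                     width_mm: float, thickness_mm: float) -> float:
    """E_f = L^3 m / (4 b d^3), where m is the slope of the initial
    load-deflection curve; result in MPa."""
    return span_mm**3 * slope_n_per_mm / (4 * width_mm * thickness_mm**3)

# Example: a 10 x 3 mm bar over a 50 mm span failing at 120 N.
print(f"{flexural_strength(120, 50, 10, 3):.1f} MPa")
```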

Relevance: 40.00%

Abstract:

Data available in the literature were used to develop a warning system for bean angular leaf spot and anthracnose, caused by Phaeoisariopsis griseola and Colletotrichum lindemuthianum, respectively. The model is based on environmental conditions favorable to the infection process, namely continuous leaf wetness duration and mean air temperature during this subphase of the pathogen-host relationship cycle. Equations published by DALLA PRIA (1977) describing the interaction of these two factors on disease severity were used. An Excel spreadsheet was used to calculate the leaf wetness period needed to cause different infection probabilities at different temperature ranges. These data were employed to elaborate critical-period tables used to program a computerized electronic device that records leaf wetness duration and mean temperature and automatically displays the daily disease severity value (DDSV) for each disease. The model should be validated in field experiments under natural infection, in which the daily disease severity sum (DDSS) should be identified as a criterion to indicate the beginning and the interval of fungicide applications to control both diseases.
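
The published equations themselves are not reproduced in the abstract, so the sketch below is a hedged illustration of how a critical-period table maps a day's mean temperature and leaf wetness duration to a DDSV, and how DDSVs accumulate into the DDSS; the table values are invented placeholders, not DALLA PRIA's data.

```python
# Hedged sketch only: hypothetical critical-period table mapping a
# temperature band and continuous leaf wetness (hours) to a daily
# disease severity value (DDSV), as the warning device is described to do.
CRITICAL_PERIODS = {
    (10, 15): [(6, 1), (12, 2), (24, 3)],   # (wetness_hours, DDSV)
    (15, 20): [(4, 1), (8, 2), (16, 3)],
    (20, 25): [(3, 1), (6, 2), (12, 3)],
    (25, 30): [(4, 1), (8, 2), (16, 3)],
}

def daily_severity(mean_temp_c: float, wetness_hours: float) -> int:
    """Return the DDSV for one day from mean temperature and leaf wetness."""
    for (lo, hi), thresholds in CRITICAL_PERIODS.items():
        if lo <= mean_temp_c < hi:
            ddsv = 0
            for hours_needed, score in thresholds:
                if wetness_hours >= hours_needed:
                    ddsv = score
            return ddsv
    return 0  # temperature outside the favorable range

def severity_sum(days: list[tuple[float, float]]) -> int:
    """DDSS: the seasonal accumulation of daily severity values; a spray
    would be triggered once it crosses a field-calibrated threshold."""
    return sum(daily_severity(t, w) for t, w in days)

print(severity_sum([(22.0, 8.0), (18.0, 5.0), (26.0, 2.0)]))
```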

Relevance: 40.00%

Abstract:

Transportation and warehousing are large and growing sectors of society, and their efficiency is of high importance. Transportation also has a large share of global carbon dioxide emissions, which are one of the leading causes of anthropogenic climate warming. Various countries have agreed to decrease their carbon emissions according to the Kyoto protocol, yet transportation is the only sector where emissions have steadily increased since the 1990s, which highlights the importance of transportation efficiency. The efficiency of transportation and warehousing can be improved with the help of simulations, but models alone are not sufficient; this research concentrates on the use of simulations in decision support systems. Three main simulation approaches are used in logistics: discrete-event simulation, system dynamics, and agent-based modeling. Each approach has weaknesses of its own, however, and hybridization (combining two or more approaches) can improve the quality of the models, as it allows one method to compensate for the weaknesses of another. It is important to choose the correct approach (or combination of approaches) when modeling transportation and warehousing issues. If an inappropriate method is chosen (which can occur if the modeler is proficient in only one approach or the model specification is not conducted thoroughly), the simulation model will have an inaccurate structure, which in turn will lead to misleading results. This issue can escalate further, as the decision-maker may assume that the presented simulation model gives the most useful results available, even though the whole model may be based on a poorly chosen structure. This research argues that several issues must be taken into account to make a functioning simulation-based decision support system: the actual simulation model can be constructed using any (or several) of the approaches, it can be combined with different optimization modules, and there needs to be a proper interface between the model and the user. These issues are presented in a framework that simulation modelers can use when creating decision support systems. For decision-makers to fully benefit from the simulations, the user interface needs to clearly separate the model from the user, while still letting the user perform the appropriate simulation runs to analyze the problems correctly. This study recommends that simulation modelers start to transfer their tacit knowledge into explicit knowledge, which would greatly benefit the whole simulation community and improve the quality of simulation-based decision support systems. More studies should also be conducted using hybrid models and integrating simulations with Geographic Information Systems.
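
As a hedged illustration of the hybridization the research argues for (not the thesis's own model), the sketch below couples a discrete-event warehouse, built with the simpy library, to simple truck agents that decide for themselves when to depart; all names and parameters are invented.

```python
# Illustrative hybrid model, not the thesis's own: discrete-event simulation
# (simpy) handles dock service times, while lightweight agents decide when
# to dispatch. All parameters below are invented for illustration.
import random
import simpy

class TruckAgent:
    """Agent-based side: each truck departs once it is 'full enough'."""
    def __init__(self, name: str, capacity: int, dispatch_threshold: float):
        self.name = name
        self.capacity = capacity
        self.load = 0
        self.dispatch_threshold = dispatch_threshold

    def should_dispatch(self) -> bool:
        return self.load >= self.dispatch_threshold * self.capacity

def warehouse(env: simpy.Environment, dock: simpy.Resource, truck: TruckAgent):
    """Discrete-event side: loading at a shared dock takes stochastic time."""
    while True:
        with dock.request() as req:
            yield req
            yield env.timeout(random.uniform(0.5, 1.5))  # load one pallet
            truck.load += 1
        if truck.should_dispatch():
            print(f"{env.now:6.2f}: {truck.name} departs with {truck.load} pallets")
            yield env.timeout(8.0)  # round trip to the customer
            truck.load = 0

random.seed(42)
env = simpy.Environment()
dock = simpy.Resource(env, capacity=1)  # one loading dock, a DES resource
for i in range(2):
    env.process(warehouse(env, dock, TruckAgent(f"truck-{i}", 20, 0.8)))
env.run(until=100)
```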

Relevance: 40.00%

Abstract:

Combating climate change is one of the key tasks of humanity in the 21st century. One of its leading causes is carbon dioxide emissions from the use of fossil fuels; renewable energy sources should be used instead of relying on oil, gas, and coal. In Finland a significant amount of energy is produced from wood, and the use of wood chips is expected to increase significantly in the future, by over 60%. The aim of this research is to improve understanding of the costs of wood chip supply chains, using simulation as the main research method. The simulation model combines agent-based modelling and discrete-event simulation to imitate the wood chip supply chain. This thesis concentrates on the use of simulation-based decision support systems in strategic decision-making. The simulation model is part of a decision support system, which connects the model to databases and also provides a graphical user interface for the decision-maker. The main analysis conducted with the decision support system compares a traditional supply chain to a supply chain utilizing specialized containers. According to the analysis, the container supply chain achieves lower costs than the traditional supply chain, and it can be scaled up more easily owing to faster emptying operations. Initially the container operations would supply only part of the fuel needs of a power plant, complementing the current supply chain. The model can be expanded to include intermodal supply chains since, due to increased demand in the future, there will not be enough wood chips located close to current and future power plants.
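
The container-versus-traditional comparison is, at its core, a cost-per-delivered-MWh calculation in which faster container emptying reduces the time a truck is tied up at the plant. The sketch below illustrates that arithmetic only; every figure is an invented placeholder, not data from the thesis.

```python
# Back-of-the-envelope cost comparison in the spirit of the analysis
# described; every number below is an invented placeholder.

def chain_cost_per_mwh(fixed_eur_per_trip: float,
                       variable_eur_per_km: float,
                       distance_km: float,
                       unload_hours: float,
                       crew_eur_per_hour: float,
                       mwh_per_load: float) -> float:
    """Cost of one delivered MWh for a single truck load."""
    trip = fixed_eur_per_trip + 2 * distance_km * variable_eur_per_km
    unload = unload_hours * crew_eur_per_hour
    return (trip + unload) / mwh_per_load

# Traditional chain: slow emptying ties up truck and crew at the plant.
traditional = chain_cost_per_mwh(120, 1.4, 60, unload_hours=1.0,
                                 crew_eur_per_hour=55, mwh_per_load=110)
# Container chain: containers are swapped, so unloading is much faster
# (which is also what lets the chain scale to more trips per shift).
container = chain_cost_per_mwh(130, 1.4, 60, unload_hours=0.25,
                               crew_eur_per_hour=55, mwh_per_load=110)

print(f"traditional: {traditional:.2f} EUR/MWh")
print(f"container:   {container:.2f} EUR/MWh")
```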

Relevance: 40.00%

Abstract:

Modern society is becoming increasingly dependent on software applications. These run on processors, use memory, and account for controlling functionalities that are often taken for granted. Typically, applications adjust their functionality in response to a certain context that is provided or derived from an informal environment with varying qualities. To rigorously model the dependence of an application on a context, the details of the context are abstracted and the environment is assumed stable and fixed. However, in a context-aware ubiquitous computing environment populated by autonomous agents, a context and its quality parameters may change at any time. This raises the need to derive the current context and its qualities at runtime. It also implies that a context is never certain and may be subjective, issues captured by the context's quality parameter of experience-based trustworthiness. Given this, the research question of this thesis is: in what logical topology and by what means may context provided by autonomous agents be derived and formally modelled to serve the context-awareness requirements of an application? This research question also stipulates that the context derivation needs to incorporate the quality of the context. In this thesis, we focus on the quality-of-context parameter of trustworthiness, based on experiences that carry a level of certainty and on referral experiences, thus making trustworthiness reputation based. Hence, we seek a basis on which to reason about and analyse the inherently inaccurate context derived by autonomous agents populating a ubiquitous computing environment, in order to formally model context-awareness. More specifically, the contribution of this thesis is threefold: (i) we propose a logical topology of context derivation and a method of calculating its trustworthiness, (ii) we provide a general model for storing experiences, and (iii) we formalise the dependence between the logical topology of context derivation and its experience-based trustworthiness. These contributions enable abstraction of a context and its quality parameters to a Boolean decision at runtime that may be formally reasoned with; we employ the Action Systems framework for this modelling. The thesis is a compendium of the author's scientific papers, which are republished in Part II. Part I introduces the field of research and provides the connecting elements that make the thesis a coherent treatment of the research question; in Part I we also review a significant body of related literature in order to better illustrate our contributions to the research field.
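
The abstract does not spell out the trust calculus, so the following is a hedged, illustrative sketch of one common way to aggregate direct and referral experiences with certainty weights (a beta-reputation-style mean), together with the abstraction to a Boolean decision that the thesis describes; all names and weights are invented.

```python
# Hedged illustration only: the thesis's own formalism is not given in the
# abstract. This sketches a beta-reputation-style aggregation of direct and
# referral experiences, each weighted by the certainty of the observation.
from dataclasses import dataclass

@dataclass
class Experience:
    positive: bool      # did the provided context turn out to be correct?
    certainty: float    # 0..1, how sure the observer was of the outcome
    referral: bool      # True if reported by another agent, not observed

def trustworthiness(experiences: list[Experience],
                    referral_discount: float = 0.5) -> float:
    """Expected trust in (0, 1): certainty-weighted beta-distribution mean."""
    alpha, beta = 1.0, 1.0  # uniform prior: no experiences -> trust 0.5
    for e in experiences:
        w = e.certainty * (referral_discount if e.referral else 1.0)
        if e.positive:
            alpha += w
        else:
            beta += w
    return alpha / (alpha + beta)

def trusted(experiences: list[Experience], threshold: float = 0.7) -> bool:
    """Abstraction to a Boolean decision at runtime, as the thesis describes."""
    return trustworthiness(experiences) >= threshold
```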

Relevance: 40.00%

Abstract:

Pulse Response Based Control (PRBC) is a recently developed minimum-time control method for flexible structures. The flexible behavior of the structure is represented through a set of discrete-time sequences, which are the responses of the structure to rectangular force pulses delivered by the actuators that control the structure. The set of pulse responses, the desired outputs, and the force bounds form a numerical optimization problem whose solution is a minimum-time piecewise-constant control sequence for driving the system to a desired final state. The method was developed for driving positive semi-definite systems. In case the system is positive definite, some final states of the system may not be reachable. Necessary conditions for reachability of the final states are derived for systems with a finite number of degrees of freedom, and numerical results are presented that confirm the derived analytical conditions. Numerical simulations of maneuvers of distributed parameter systems have shown a relationship between the error in the estimated minimum control time and the sampling interval.
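
Under this formulation, the output is a superposition of pulse responses scaled by the piecewise-constant control values, so checking whether a final state is reachable at a given horizon under force bounds reduces to a linear feasibility problem. Below is a hedged, single-output sketch with an invented pulse response (not the paper's models):

```python
# Hedged sketch of the idea behind PRBC with invented dynamics: the output
# is a superposition of pulse responses, so reaching a desired final value
# under force bounds is a linear feasibility problem in the pulse amplitudes.
import numpy as np
from scipy.optimize import linprog

n_steps = 40
t = np.arange(n_steps) * 0.1
# Invented pulse response of one lightly damped mode (a placeholder, not a
# distributed-parameter model from the paper).
h = np.exp(-0.1 * t) * np.sin(2.0 * t)

# Final output after n_steps of piecewise-constant control u:
#   y[N-1] = sum_j h[N-1-j] * u[j]  ->  one equality row of a linear program.
a_eq = h[::-1].reshape(1, -1)
y_desired = np.array([1.0])
u_max = 2.0  # actuator force bound

res = linprog(c=np.zeros(n_steps),            # pure feasibility check
              A_eq=a_eq, b_eq=y_desired,
              bounds=[(-u_max, u_max)] * n_steps,
              method="highs")
print("reachable in", n_steps, "steps:", res.success)
```

Minimum-time control would then repeat such a feasibility check while shrinking the horizon until it first becomes infeasible.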

Relevance: 40.00%

Abstract:

Personalized nanomedicine has been shown to provide advantages over traditional clinical imaging, diagnosis, and conventional medical treatment. Nanoparticles can enhance and sharpen clinical targeting and imaging, since they can be led precisely to the place in the body that is the goal of treatment, while reducing the side effects that usually occur in the parts of the body that are not targets of the treatment. Nanoparticles are of a size that can penetrate into cells, and their surface functionalization offers a way to increase their sensitivity when detecting target molecules; it also increases the potential for flexibility in particle design, in their therapeutic function, and in the variation possibilities for diagnostics. Mesoporous nanoparticles of amorphous silica have attractive physical and chemical characteristics, such as their particle morphology, controllable pore size, and high surface area and pore volume. Additionally, the surface functionalization of silica nanoparticles is relatively straightforward, which enables optimization of the interaction between the particles and the biological system. The main goal of this study was to prepare traceable and targetable silica nanoparticles for medical applications, with a special focus on particle dispersion stability, biocompatibility, and targeting capabilities. Nanoparticle properties are highly particle-size dependent, and good dispersion stability is a prerequisite for active therapeutic and diagnostic agents. The study showed that traceable streptavidin-conjugated silica nanoparticles exhibiting good dispersibility could be obtained by the choice of a proper surface functionalization route. Theranostic nanoparticles should exhibit sufficient hydrolytic stability to effectively carry the medicine to the target cells, after which they should disintegrate and dissolve; furthermore, the surface groups should stay at the particle surface until the particle has been internalized by the cell, in order to optimize cell specificity. Model particles with fluorescently labeled regions were tested in vitro using light microscopy and image processing technology, which allowed a detailed study of the disintegration and dissolution process. The study showed that nanoparticles degrade more slowly outside the cell than inside it. The main advantage of theranostic agents is their successful targeting in vitro and in vivo. Non-porous nanoparticles using monoclonal antibodies as guiding ligands were tested in vitro in order to follow their targeting ability and internalization; in addition to the successful targeting, a specific internalization route for the particles could be detected. In the last part of the study, the objective was to clarify the feasibility of traceable mesoporous silica nanoparticles, loaded with a hydrophobic cancer drug, for targeted drug delivery in vitro and in vivo. The particles were provided with a small-molecule targeting ligand. A significantly higher therapeutic effect could be achieved with the nanoparticles than with the free drug; the nanoparticles were biocompatible and stayed in the tumor longer than the free drug did before being eliminated by renal excretion. Overall, the results showed that mesoporous silica nanoparticles are biocompatible, biodegradable drug carriers and that cell specificity can be achieved both in vitro and in vivo.

Relevance: 40.00%

Abstract:

Nitric oxide (NO) donors produce NO-related activity when applied to biological systems. Among its diverse functions, NO has been implicated in vascular smooth muscle relaxation. Despite the great importance of NO in biological systems, its pharmacological and physiological study has been limited by its high reactivity and short half-life. In this review we focus on our recent investigations of nitrosyl ruthenium complexes as NO-delivery agents and their effects on vascular smooth muscle cell relaxation. The high affinity of ruthenium for NO is a marked feature of its chemistry. The main signaling pathway responsible for the vascular relaxation induced by NO involves the activation of soluble guanylyl cyclase, with subsequent accumulation of cGMP and activation of cGMP-dependent protein kinase. This in turn can activate several proteins, such as K+ channels, as well as induce vasodilatation through a decrease in cytosolic Ca2+. Oxidative stress and the associated oxidative damage are mediators of vascular damage in several cardiovascular diseases, including hypertension. Increased production of the superoxide anion (O2-) by the vascular wall has been observed in different animal models of hypertension. Vascular relaxation in response to endogenous NO or to NO released from NO donors is impaired in vessels from renal hypertensive (2K-1C) rats. A growing body of evidence supports the possibility that increased NO inactivation by excess O2- may account for the decreased NO bioavailability and vascular dysfunction in hypertension.

Relevance: 40.00%

Abstract:

Classical Pavlovian fear conditioning to painful stimuli has provided the generally accepted view of a core system, centered in the central amygdala, that organizes fear responses. Ethologically based models using other sources of threat likely to be encountered in a natural environment, such as predators or aggressive dominant conspecifics, have challenged this concept of a unitary core circuit for fear processing. We discuss here what the ethologically based models have told us about the neural systems organizing fear responses. We explore the concept that parallel paths process different classes of threats, and that these different paths influence distinct regions in the periaqueductal gray, a critical element for the organization of all kinds of fear responses. Despite this parallel processing of different kinds of threats, we also discuss an interesting emerging view that common cortical-hippocampal-amygdalar paths seem to be engaged in fear conditioning to painful stimuli, to predators and, perhaps, to aggressive dominant conspecifics as well. Overall, the aim of this review is to bring into focus a more global and comprehensive view of the systems organizing fear responses.

Relevance: 40.00%

Abstract:

With the new age of the Internet of Things (IoT), everyday objects such as mobile smart devices are starting to be equipped with cheap sensors and low-energy wireless communication capability. Mobile smart devices (phones, tablets) have become ubiquitous, with everyone having access to at least one. There is an opportunity to build innovative applications and services by exploiting these devices' untapped rechargeable energy, sensing and processing capabilities. In this thesis, we propose, develop, implement and evaluate LoadIoT, a peer-to-peer load-balancing scheme that can distribute tasks among a plethora of mobile smart devices in the IoT world. We develop and demonstrate an Android-based proof-of-concept load-balancing application, and we present a model of the system which is used to validate the efficiency of the load-balancing approach under varying application scenarios. Load-balancing concepts can be applied to IoT scenarios built around smart devices, reducing the traffic sent to the cloud as well as the energy consumption of the devices. The data acquired from the experimental outcomes enable us to determine the feasibility and cost-effectiveness of load-balanced P2P smartphone-based applications.
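
As a hedged illustration of the kind of decision a P2P load balancer such as LoadIoT must make, the sketch below scores peers by battery, load and charging state and picks an offload target; the actual LoadIoT policy is not detailed in the abstract, and every attribute and weight here is invented.

```python
# Hedged sketch of a peer-selection policy for P2P task offloading; all
# attribute names and weights are invented placeholders, not LoadIoT's own.
from dataclasses import dataclass

@dataclass
class Peer:
    name: str
    battery: float     # remaining charge, 0..1
    cpu_load: float    # current utilization, 0..1
    charging: bool     # devices on a charger are preferred targets

def offload_score(p: Peer) -> float:
    """Higher is better: favor charged or charging, lightly loaded peers."""
    score = 0.6 * p.battery + 0.4 * (1.0 - p.cpu_load)
    return score + (0.5 if p.charging else 0.0)

def pick_peer(peers: list[Peer], min_battery: float = 0.2) -> Peer | None:
    """Offload to the best eligible peer, or None (process locally/cloud)."""
    eligible = [p for p in peers if p.battery >= min_battery or p.charging]
    return max(eligible, key=offload_score, default=None)

peers = [Peer("phone-a", 0.9, 0.1, False),
         Peer("tablet-b", 0.4, 0.7, True),
         Peer("phone-c", 0.15, 0.2, False)]
target = pick_peer(peers)
print("offload to:", target.name if target else "local/cloud")
```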

Relevance: 40.00%

Abstract:

Software is a key component in many of the devices and products that we use every day. Most customers demand not only that their devices function as expected but also that the software be of high quality: reliable, fault tolerant, efficient, and so on. In short, it is not enough that a calculator gives the correct result of a calculation; we want the result instantly, in the right form, with minimal use of battery, etc. One of the key aspects of succeeding in today's industry is delivering high quality. In most software development projects, high-quality software is achieved by rigorous testing and good quality assurance practices. However, customers are now asking for these high-quality software products at an ever-increasing pace, which leaves companies with less time for development. Software testing is an expensive activity, because it requires much manual work: testing, debugging, and verification are estimated to consume 50 to 75 per cent of the total development cost of complex software projects. Further, the most expensive software defects are those which have to be fixed after the product is released. One of the main challenges in software development is therefore reducing the cost and time of software testing without sacrificing the quality of the developed software. It is often not enough to demonstrate only that a piece of software is functioning correctly; usually, many other aspects of the software, such as performance, security, scalability, and usability, need to be verified as well. Testing these aspects of the software is traditionally referred to as non-functional testing. One of the major challenges with non-functional testing is that it is usually carried out at the end of the software development process, when most of the functionality is implemented, because non-functional aspects, such as performance or security, apply to the software as a whole. In this thesis, we study the use of model-based testing. We present approaches to automatically generate tests from behavioral models for solving some of these challenges, and we show that model-based testing is applicable not only to functional testing but also to non-functional testing. In its simplest form, performance testing is performed by executing multiple test sequences at once while observing the software in terms of responsiveness and stability rather than output. The main contribution of the thesis is a coherent model-based testing approach for testing functional and performance-related issues in software systems. We show how we go from system models, expressed in the Unified Modeling Language, to test cases and back to models again. The system requirements are traced throughout the entire testing process; requirements traceability facilitates finding faults in the design and implementation of the software. In the research field of model-based testing, many newly proposed approaches suffer from poor or missing tool support. Therefore, the second contribution of this thesis is proper tool support for the proposed approach, integrated with leading industry tools: we offer independent tools, tools that are integrated with other industry-leading tools, and complete tool chains when necessary. Many model-based testing approaches proposed by the research community also suffer from poor empirical validation in an industrial context. In order to demonstrate the applicability of our proposed approach, we apply our research to several systems, including industrial ones.
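
As a hedged illustration of the core of model-based test generation (not the thesis's UML-based tool chain), the sketch below derives test sequences from a small behavioral model by covering every transition of a state machine:

```python
# Hedged illustration of model-based test generation: derive one event
# sequence per transition of a toy behavioral model. The model and events
# are invented; the thesis itself works from UML models with tool support.
from collections import deque

# Toy behavioral model of a calculator-like device: state -> {event: next}.
MODEL = {
    "idle":     {"power_on": "ready"},
    "ready":    {"digit": "entering", "power_off": "idle"},
    "entering": {"equals": "result", "clear": "ready"},
    "result":   {"clear": "ready", "power_off": "idle"},
}

def transition_coverage_tests(start: str = "idle") -> list[list[str]]:
    """Breadth-first search: one event sequence per reachable transition."""
    tests, seen = [], set()
    queue = deque([(start, [])])
    while queue:
        state, path = queue.popleft()
        for event, nxt in MODEL[state].items():
            edge = (state, event)
            if edge in seen:
                continue
            seen.add(edge)
            tests.append(path + [event])   # a test = a sequence of events
            queue.append((nxt, path + [event]))
    return tests

for i, seq in enumerate(transition_coverage_tests(), 1):
    print(f"test {i}: {' -> '.join(seq)}")
```

Running the same generated sequences concurrently while measuring response times, rather than checking outputs, is the simple form of performance testing the abstract describes.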