Abstract:
The particle-based Lattice Solid Model (LSM) was developed to provide a basis for studying the physics of rocks and the nonlinear dynamics of earthquakes (MORA and PLACE, 1994; PLACE and MORA, 1999). A new modular and flexible LSM approach has been developed that allows different microphysics to be easily included in or removed from the model. The approach provides a virtual laboratory where numerical experiments can easily be set up and all measurable quantities visualised. The proposed approach provides a means to simulate complex phenomena such as fracturing or localisation processes, and enables the effect of different microphysics on macroscopic behaviour to be studied. The initial 2-D model is extended to allow three-dimensional simulations to be performed and particles of different sizes to be specified. Numerical bi-axial compression experiments under different confining pressures are used to calibrate the model. By tuning the different microscopic parameters (such as the coefficient of friction, microscopic strength and distribution of grain sizes), the macroscopic strength of the material can be adjusted to agree with laboratory experiments, and the orientation of fractures is consistent with the theoretical value predicted from the Mohr-Coulomb diagram. Simulations indicate that 3-D numerical models have different macroscopic properties than 2-D ones and, hence, the model must be recalibrated for 3-D simulations. These numerical experiments illustrate that the new approach is capable of simulating typical rock fracture behaviour. The new model provides a basis to investigate nucleation, rupture and slip pulse propagation in complex fault zones without the previous model limitations of a regular low-level surface geometry and a restriction to two dimensions.
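As a point of reference for the fracture-orientation claim above, the Mohr-Coulomb prediction commonly used in such calibrations relates the failure-plane angle to the internal friction angle. A minimal statement in LaTeX, assuming the usual convention that the angle θ is measured between the failure plane and the direction of the maximum principal stress:

    \theta = 45^{\circ} - \frac{\phi}{2}, \qquad \phi = \arctan\mu,

where μ is the macroscopic coefficient of friction and φ the corresponding internal friction angle.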
Abstract:
Improvements in the analysis and reporting of results of osteoarthritis (OA) clinical trials have recently been obtained through the harmonization and standardization of the selection of outcome variables (OMERACT 3 and OARSI). Moreover, OARSI has recently proposed the OARSI responder criteria. This composite index permits the presentation of results of symptom-modifying clinical trials in OA based on individual patient responses (responder yes/no). The two organizations (OMERACT and OARSI) established a task force aimed at evaluating: (1) the variability of observed placebo and active treatment effects using the OARSI responder criteria; and (2) the possibility of proposing a simplified set of criteria. The conclusions of the task force were presented and discussed during the OMERACT 6 conference, where a simplified set of responder criteria (the OMERACT-OARSI set of criteria) was proposed.
Abstract:
In this paper we investigate the construction of state models for link invariants using representations of the braid group obtained from various gauge choices for a solution of the trigonometric Yang-Baxter equation. Our results show that it is possible to obtain invariants of regular isotopy (as defined by Kauffman) which need not be invariants of ambient isotopy. We illustrate our results with explicit computations using solutions of the trigonometric Yang-Baxter equation associated with the one-parameter family of minimal typical representations of the quantum superalgebra U_q[gl(2|1)]. We have implemented Mathematica code to evaluate the invariants for all prime knots up to 10 crossings.
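For readers unfamiliar with the equation referenced above, the spectral-parameter (trigonometric) form of the Yang-Baxter equation satisfied by such a solution R(u) can be written, in standard notation, as

    R_{12}(u)\, R_{13}(u+v)\, R_{23}(v) = R_{23}(v)\, R_{13}(u+v)\, R_{12}(u),

where R_{ij}(u) acts on the i-th and j-th factors of a threefold tensor product; the specific gauge choices and the U_q[gl(2|1)] representation used in the paper are not reproduced here.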
Abstract:
The Commonwealth Government's principles-based Review of the Law of Negligence recently recommended reforms aimed at limiting liability and damages arising from personal injury and death, in response to the growing perception that the current system of compensating personal injury had become financially unsustainable. Recent increases in medical liability and damages have eroded the confidence of doctors and their professional bodies, with fears of unprecedented desertion from, and reduced recruitment into, high-risk areas, and one of the primary foci of the review concerned medical negligence. The article analyses proposals to redefine the principles necessary for a finding of negligence, against the terms of reference of the review. The article assumes that, for the foreseeable future, Australia will persist with tort-based compensation for personal injury rather than developing a no-fault scheme. If the suggested changes to the fundamental principles of negligence are unlikely to reduce medical liability, greater attention might be given to the processes which come into play after a finding of negligence, where reform is more likely to benefit both plaintiffs and defendants.
Abstract:
Histidines 107 and 109 in the glycine receptor (GlyR) α1 subunit have previously been identified as determinants of the inhibitory zinc-binding site. Based on modeling of the GlyR α1 subunit extracellular domain by homology to the acetylcholine-binding protein crystal structure, we hypothesized that inhibitory zinc is bound within the vestibule lumen at subunit interfaces, where it is ligated by His107 from one subunit and His109 from an adjacent subunit. This was tested by co-expressing α1 subunits containing the H107A mutation with α1 subunits containing the H109A mutation. Although sensitivity to zinc inhibition is markedly reduced when either mutation is individually incorporated into all five subunits, the GlyRs formed by the co-expression of H107A mutant subunits with H109A mutant subunits exhibited an inhibitory zinc sensitivity similar to that of the wild-type α1 homomeric GlyR. This constitutes strong evidence that inhibitory zinc is coordinated at the interface between adjacent α1 subunits. No evidence was found for β subunit involvement in the coordination of inhibitory zinc, indicating that a maximum of two zinc-binding sites per α1β receptor is sufficient for maximal zinc inhibition. Our data also show that two zinc-binding sites are sufficient for significant inhibition of α1 homomers. The binding of zinc at the interface between adjacent α1 subunits could restrict intersubunit movements, providing a feasible mechanism for the inhibition of channel activation by zinc.
Abstract:
A probe tack test was used for the in situ characterization of the surface stickiness of hemispherical drops, with an initial radius of 3.5 mm, while drying. The surface stickiness of drops of fructose and maltodextrin solutions dried at 63 °C and 95 °C was determined. The effect of the addition of maltodextrin to fructose solutions was studied with fructose/maltodextrin solid mass ratios of 4:1, 1:1, and 1:4. Pure fructose solutions remained completely sticky and failed cohesively even when their moisture approached zero. Shortly after the start of drying, the surface of the maltodextrin drops formed a skin, which rapidly grew in thickness. Subsequently the drop surface became completely nonsticky, probably due to the transformation of the outer layers into a glassy material. Addition of maltodextrin significantly altered the surface stickiness of drops of fructose solutions, demonstrating its use as an effective drying aid.
Abstract:
We discuss the existence and multiplicity of positive solutions of the Dirichlet problem for the quasilinear ordinary differential equation $-\left( u'/\sqrt{1-u'^2} \right)' = f(t, u)$. Depending on the behaviour of f = f(t, s) near s = 0, we prove the existence of either one, or two, or three, or infinitely many positive solutions. In general, the positivity of f is not required. All results are obtained by reduction to an equivalent non-singular problem to which variational or topological methods apply in a classical fashion.
Abstract:
A new general fitting method based on the Self-Similar (SS) organization of random sequences is presented. The proposed analytical function helps to fit the response of many complex systems when their recorded data form a self-similar curve. The verified SS principle opens new possibilities for the fitting of economic, meteorological and other complex data when a mathematical model is absent but a reduced description in terms of some universal set of fitting parameters is needed. The fitting function is verified on economic data (the price of a commodity versus time) and weather data (the Earth's mean surface temperature versus time), and for these nontrivial cases a very good fit of the initial data set is obtained. The general conditions of application of this fitting method, describing the response of many complex systems, and the forecasting possibilities are discussed.
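The abstract does not reproduce the analytical fitting function itself. Purely as an illustration of the workflow it describes — fitting recorded data with a small universal set of parameters — a minimal sketch using SciPy's least-squares fitter is given below; the model function (a power-law trend plus a slow oscillation) is a hypothetical stand-in, not the SS function of the paper.

    import numpy as np
    from scipy.optimize import curve_fit

    def model(t, a, b, nu, c):
        # Hypothetical reduced description: power-law trend plus a slow oscillation.
        # This stands in for the (unspecified) self-similar fitting function.
        return a * t**nu + b * np.cos(2 * np.pi * t / c)

    t = np.linspace(1.0, 10.0, 200)                  # "time" axis of the recorded data
    data = model(t, 2.0, 0.5, 0.8, 3.0) + 0.05 * np.random.randn(t.size)

    params, cov = curve_fit(model, t, data, p0=[1.0, 1.0, 1.0, 2.0])
    print("fitted parameter set:", params)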
Abstract:
The Darwinian Particle Swarm Optimization (DPSO) is an evolutionary algorithm that extends Particle Swarm Optimization with natural selection to enhance the ability to escape from sub-optimal solutions. An extension of the DPSO to multi-robot applications has recently been proposed and denoted Robotic Darwinian PSO (RDPSO), benefiting from the dynamic partitioning of the whole population of robots and hence decreasing the amount of information exchange required among robots. This paper further extends the previously proposed algorithm by adapting the behavior of the robots based on a set of context-based evaluation metrics. Those metrics are then used as inputs of a fuzzy system so as to systematically adjust the RDPSO parameters (i.e., the outputs of the fuzzy system), thus improving its convergence rate and its robustness to obstacles and to communication constraints. The adapted RDPSO is evaluated in groups of physical robots and further explored using larger populations of simulated mobile robots within a larger scenario.
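As context for the parameter adaptation described above, the PSO parameters being adjusted (inertia and the cognitive/social weights) feed into the usual velocity/position update rule. A minimal sketch follows in which the fuzzy system is replaced by a placeholder returning adapted coefficients; the function names and the values returned are illustrative assumptions, not the RDPSO implementation.

    import numpy as np

    def fuzzy_adapt(context):
        # Placeholder for the fuzzy system: maps context metrics (e.g. obstacle and
        # communication measures) to PSO parameters. Fixed values are used here for illustration.
        return {"w": 0.7, "c1": 1.5, "c2": 1.5}

    def pso_step(x, v, pbest, gbest, params, rng):
        # Standard PSO velocity and position update.
        r1, r2 = rng.random(x.shape), rng.random(x.shape)
        v = (params["w"] * v
             + params["c1"] * r1 * (pbest - x)
             + params["c2"] * r2 * (gbest - x))
        return x + v, v

    rng = np.random.default_rng(0)
    x = rng.uniform(-5, 5, size=(10, 2))    # 10 robots, 2-D positions
    v = np.zeros_like(x)
    pbest, gbest = x.copy(), x[0].copy()

    for _ in range(50):
        params = fuzzy_adapt(context={})    # would be fed the context-based evaluation metrics
        x, v = pso_step(x, v, pbest, gbest, params, rng)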
Abstract:
Due to the growing complexity and adaptability requirements of real-time systems, which often exhibit unrestricted Quality of Service (QoS) inter-dependencies among supported services and user-imposed quality constraints, it is increasingly difficult to optimise the level of service of a dynamic task set within a useful and bounded time. This is even more difficult when intending to benefit from the full potential of an open distributed cooperating environment, where service characteristics are not known beforehand and tasks may be inter-dependent. This paper focuses on optimising a dynamic local set of inter-dependent tasks that can be executed at varying levels of QoS, in order to achieve an efficient resource usage that is constantly adapted to the specific constraints of devices and users, the nature of the executing tasks and dynamically changing system conditions. Extensive simulations demonstrate that the proposed anytime algorithms are able to quickly find a good initial solution and to effectively optimise the rate at which the quality of the current solution improves as the algorithms are given more time to run, with minimal overhead compared against their traditional versions.
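To illustrate the "anytime" behaviour claimed above — a good initial solution found quickly, then monotone improvement until the available time is exhausted — a generic skeleton of such an algorithm is sketched below; the quality function and the local-search step are hypothetical placeholders, not the paper's QoS optimiser.

    import time
    import random

    def anytime_optimise(initial_solution, quality, improve, deadline_s):
        # Returns the best solution found before the deadline; interruptible at any time.
        best = initial_solution
        best_q = quality(best)
        start = time.monotonic()
        while time.monotonic() - start < deadline_s:
            candidate = improve(best)          # small local change
            q = quality(candidate)
            if q > best_q:                     # keep only monotone improvements
                best, best_q = candidate, q
        return best, best_q

    # Toy usage: maximise a concave quality measure over a single service level.
    quality = lambda s: -(s - 7.3) ** 2
    improve = lambda s: s + random.uniform(-0.5, 0.5)
    print(anytime_optimise(0.0, quality, improve, deadline_s=0.05))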
Abstract:
Long Term Evolution (LTE) is one of the latest standards in the mobile communications market. To achieve its performance, LTE networks use several techniques, such as multi-carrier transmission, multiple-input-multiple-output and cooperative communications. Within cooperative communications, this paper focuses on the fixed relaying technique, presenting a way of determining the best position at which to deploy the relay station (RS), from a set of empirically good candidates, and also of quantifying the associated performance gain using different cluster size configurations. The best RS position was obtained through realistic simulations, which place it at the middle of the cell's circumference arc. Additionally, the simulations also confirmed that the network's performance improves as the number of RSs is increased. It was possible to conclude that, for each deployed RS, the percentage of area served by an RS increases by about 10 %. Furthermore, the mean data rate in the cell increased by approximately 60 % through the use of RSs. Finally, a given scenario with a larger number of RSs can achieve the same performance as an equivalent scenario without RSs but with a higher reuse distance. This leads to a compromise between RS installation and cluster size, in order to maximize capacity as well as performance.
Abstract:
It is important to understand and forecast the daily consumption of a typical or a particular household in order to design and size suitable renewable energy systems and energy storage. In this research on Short Term Load Forecasting (STLF), Artificial Neural Networks (ANNs) were used and, despite the unpredictability of consumption, it is shown that it is nevertheless possible to forecast the electricity consumption of a household. ANNs are recognized as a suitable methodology for modeling hourly and daily energy consumption and for load forecasting. Input variables such as apartment area, number of occupants, electrical appliance consumption and Boolean inputs such as the hourly metering period were considered. Furthermore, the investigation carried out aims to define an ANN architecture and a training algorithm that yield a robust model for forecasting energy consumption in a typical household. It was observed that a feed-forward ANN trained with the Levenberg-Marquardt algorithm provided good performance. A database of consumption records logged in 93 real households in Lisbon, Portugal, between February 2000 and July 2001, including both weekdays and weekends, was used. The results show that the ANN approach provides a reliable model for forecasting household electric energy consumption and load profiles.
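As a sketch of the kind of model described above — a small feed-forward network whose weights are fitted with a Levenberg-Marquardt least-squares solver — the following uses SciPy's LM implementation on synthetic data; the architecture, the feature set and the data are illustrative assumptions, not the study's model or its household database.

    import numpy as np
    from scipy.optimize import least_squares

    rng = np.random.default_rng(1)
    # Synthetic stand-ins for inputs of the kind listed above (hour of day, occupants, area, ...).
    X = rng.random((200, 4))
    y = 0.5 * np.sin(2 * np.pi * X[:, 0]) + 0.3 * X[:, 1] + 0.1 * rng.standard_normal(200)

    n_in, n_hid = X.shape[1], 6

    def unpack(w):
        # Split the flat parameter vector into weights and biases of one hidden layer.
        W1 = w[: n_in * n_hid].reshape(n_in, n_hid)
        b1 = w[n_in * n_hid : n_in * n_hid + n_hid]
        W2 = w[n_in * n_hid + n_hid : n_in * n_hid + 2 * n_hid]
        b2 = w[-1]
        return W1, b1, W2, b2

    def forward(w, X):
        W1, b1, W2, b2 = unpack(w)
        h = np.tanh(X @ W1 + b1)            # single hidden layer, tanh activation
        return h @ W2 + b2

    def residuals(w):
        return forward(w, X) - y

    w0 = 0.1 * rng.standard_normal(n_in * n_hid + 2 * n_hid + 1)
    fit = least_squares(residuals, w0, method="lm")   # Levenberg-Marquardt
    print("training RMSE:", np.sqrt(np.mean(fit.fun ** 2)))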
Abstract:
We study the existence and multiplicity of positive radial solutions of the Dirichlet problem for the Minkowski-curvature equation $-\mathrm{div}\left( \nabla v/\sqrt{1-|\nabla v|^2} \right) = f(|x|, v)$ in $B_R$, $v = 0$ on $\partial B_R$, where $B_R$ is a ball in $\mathbb{R}^N$ ($N \geq 2$). According to the behaviour of f = f(r, s) near s = 0, we prove the existence of either one, two or three positive solutions. All results are obtained by reduction to an equivalent non-singular one-dimensional problem, to which variational methods can be applied in a standard way.
Abstract:
Dissertation presented at the Faculdade de Ciências e Tecnologia, Universidade Nova de Lisboa, to obtain the degree of Master in Physics Engineering.
Abstract:
Manufacturing processes need to be permanently innovated and optimized, because every process is susceptible to continuous improvement. Innovation and commitment to the development of these new solutions result from existing expertise and from the continuing need to increase productivity and flexibility while ensuring the necessary quality of the manufactured products. To increase flexibility, it is necessary to significantly reduce set-up times and lead times in order to ensure ever faster delivery of products. This objective can be achieved through a normalization of the pultrusion line elements, which implicitly also increases productivity. This work is intended to optimize the pultrusion process for structural profiles. We consider all elements of the system, from the storehouse of the fibers (rack) to the pultrusion die. Particular attention was devoted to (a) the guidance system of the fibers and webs, (b) the resin container where the fibers are impregnated, (c) the standard plates positioning the fibers towards the entrance of the spinneret, and (d) the whole process of assembling and fixing the die, as well as its heating system. With the implementation of these new systems, a significant saving in set-up time was achieved and the unit costs of production were clearly reduced. Quality assurance was also improved.