996 results for Quantified real constraint
Abstract:
Quantified real constraints (QRC) are a mathematical formalism used to model a large number of physical problems involving systems of non-linear equations over real variables, some of which may be quantified. QRCs appear in numerous contexts, such as control engineering or biology. Solving QRCs is a very active research domain for which two different approaches have been proposed: symbolic quantifier elimination and approximate methods. Nevertheless, solving large-scale problems and the general case remain open problems. This thesis proposes a new approximate methodology based on Modal Interval Analysis, a mathematical theory that makes it possible to solve problems involving logical quantifiers over real variables. Finally, two applications to control engineering are presented. The first addresses the fault-detection problem and the second consists of a controller for a sailboat.
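As a concrete illustration of the formalism (an invented instance, not one from the thesis), a quantified real constraint combines universally and existentially quantified real variables with a non-linear relation:

```latex
% An illustrative quantified real constraint (invented example, not from the thesis):
% "for every x in [1,2] there is a y in [0,3] satisfying the non-linear equation."
\forall x \in [1,2] \;\; \exists y \in [0,3] : \quad x^{2} + y^{2} = 5
% This instance is satisfied, since y = \sqrt{5 - x^{2}} \in [1,2] for every such x.
```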
Abstract:
One of the techniques used to detect faults in dynamic systems is analytical redundancy. An important difficulty in applying this technique to real systems is dealing with the uncertainties associated with the system itself and with the measurements. In this paper, this uncertainty is taken into account by the use of intervals for the parameters of the model and for the measurements. The method proposed in this paper checks the consistency between the system's behavior, obtained from the measurements, and the model's behavior; if they are inconsistent, then there is a fault. The problem of detecting faults is stated as a quantified real constraint satisfaction problem, which can be solved using modal interval analysis (MIA). MIA is used because it provides powerful tools to extend calculations over real functions to intervals. To improve the fault-detection results, the simultaneous use of several sliding time windows is proposed. The result of implementing this method is semiqualitative tracking (SQualTrack), a fault-detection tool that is robust in the sense that it does not generate false alarms, i.e., if there are false alarms, they indicate either that the interval model does not represent the system adequately or that the interval measurements do not represent the true values of the variables adequately. SQualTrack is currently being used to detect faults in real processes. Some of these applications, using real data, have been developed within the European project Advanced Decision Support System for Chemical/Petrochemical Manufacturing Processes and are also described in this paper.
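As a rough illustration of the consistency test described above (a minimal sketch, not the SQualTrack implementation; the model, parameter bounds and measurement intervals are invented for the example):

```python
# Minimal sketch of interval-based fault detection by consistency checking.
# A fault is flagged when the measured output interval does not intersect the
# envelope of outputs predicted by an interval model (illustrative model only).

def predict_envelope(u, a_lo, a_hi, b_lo, b_hi):
    """Envelope of y = a*u + b when a in [a_lo, a_hi], b in [b_lo, b_hi], u >= 0."""
    return (a_lo * u + b_lo, a_hi * u + b_hi)

def intervals_intersect(x, y):
    return x[0] <= y[1] and y[0] <= x[1]

def consistent(u, y_meas, a=(0.9, 1.1), b=(-0.1, 0.1)):
    """True if the measured interval y_meas is consistent with the interval model."""
    y_pred = predict_envelope(u, a[0], a[1], b[0], b[1])
    return intervals_intersect(y_pred, y_meas)

# Example: input u = 2.0, measurement known to lie in [2.4, 2.6]
if __name__ == "__main__":
    print("fault detected" if not consistent(2.0, (2.4, 2.6)) else "no fault")
```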
Abstract:
23 p. -- An extended abstract of this work appears in the proceedings of the 2012 ACM/IEEE Symposium on Logic in Computer Science
Abstract:
We consider the quantified constraint satisfaction problem (QCSP), which is to decide, given a structure and a first-order sentence (not assumed here to be in prenex form) built from conjunction and quantification, whether or not the sentence is true on the structure. We present a proof system for certifying the falsity of QCSP instances and develop its basic theory; for instance, we provide an algorithmic interpretation of its behavior. Our proof system places the established Q-resolution proof system in a broader context, and also allows us to derive QCSP tractability results.
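For intuition, a QCSP instance over a finite structure can be decided by brute force; the sketch below (illustrative only, not the proof system from the paper) evaluates a prenex sentence whose matrix is a conjunction of constraints:

```python
# Brute-force evaluation of a QCSP instance over a finite domain (illustration).
# Sentence: Q1 x1 ... Qn xn . conjunction of constraints, with Qi in {'forall', 'exists'}.

def evaluate(domain, quantifiers, constraints, assignment=()):
    if len(assignment) == len(quantifiers):
        return all(c(assignment) for c in constraints)
    q = quantifiers[len(assignment)]
    branches = (evaluate(domain, quantifiers, constraints, assignment + (v,))
                for v in domain)
    return all(branches) if q == "forall" else any(branches)

# Example: on the structure ({0,1}; !=), the sentence  forall x exists y . x != y  is true.
if __name__ == "__main__":
    print(evaluate([0, 1],
                   ["forall", "exists"],
                   [lambda a: a[0] != a[1]]))   # -> True
```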
Abstract:
3D Virtual Environments (VEs) are real; they exist as digital worlds with the advantage of having none of the constraints of the real world. As such they are the perfect training ground for design students, who can create, build and experiment with design solutions without the constraints of real-world projects. This paper reports on an educational setting used to explore a model for using VEs such as Second Life (SL), developed by Linden Labs in California, as a collaborative environment for design education. A postgraduate landscape architecture learning environment within a collaborative design unit was developed to integrate this model, where the primary focus was the application of three-dimensional tools within design, not as a presentation tool, but rather as a design tool. The focus of the unit and its aims and objectives are outlined before describing the use of SL in the unit. Attention is focused on the collaboration and learning experience before discussing the outcomes, student feedback, future projects using this model and the potential for further research. The outcome of this study aims to contribute to current research on teaching and learning design in interactive VEs. We present a case study of our first application of this model.
Abstract:
Intelligent agents are an advanced technology utilized in Web Intelligence. When searching for information in a distributed Web environment, information is retrieved by multi-agents on the client site and fused on the broker site. Current information fusion techniques rely on the cooperation of agents to provide statistics. Such techniques are computationally expensive and unrealistic in the real world. In this paper, we introduce a model that uses a world ontology constructed from the Dewey Decimal Classification to acquire user profiles. By searching with specific and exhaustive user profiles, information fusion techniques no longer rely on the statistics provided by agents. The model has been successfully evaluated using the large INEX data set simulating the distributed Web environment.
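A toy sketch of the underlying idea, that a rich subject-based user profile can replace agent-supplied collection statistics when fusing result lists (the scoring scheme and subject labels are invented for illustration and are not the paper's model):

```python
# Toy fusion of result lists from several agents using only a user profile
# (subject -> weight), with no collection statistics from the agents.

def fuse(results_per_agent, profile):
    """results_per_agent: list of lists of (doc_id, subjects); returns ranked doc ids."""
    scores = {}
    for results in results_per_agent:
        for doc_id, subjects in results:
            score = sum(profile.get(s, 0.0) for s in subjects)
            scores[doc_id] = max(scores.get(doc_id, 0.0), score)
    return sorted(scores, key=scores.get, reverse=True)

if __name__ == "__main__":
    profile = {"025.04 Information retrieval": 0.9, "006.3 Artificial intelligence": 0.7}
    agent_a = [("d1", ["025.04 Information retrieval"]), ("d2", ["700 Arts"])]
    agent_b = [("d3", ["006.3 Artificial intelligence", "025.04 Information retrieval"])]
    print(fuse([agent_a, agent_b], profile))   # -> ['d3', 'd1', 'd2']
```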
Abstract:
The rank transform is a non-parametric technique which has been recently proposed for the stereo matching problem. The motivation behind its application to the matching problem is its invariance to certain types of image distortion and noise, as well as its amenability to real-time implementation. This paper derives an analytic expression for the process of matching using the rank transform, and then goes on to derive one constraint which must be satisfied for a correct match. This has been dubbed the rank order constraint or simply the rank constraint. Experimental work has shown that this constraint is capable of resolving ambiguous matches, thereby improving matching reliability. This constraint was incorporated into a new algorithm for matching using the rank transform. This modified algorithm resulted in an increased proportion of correct matches, for all test imagery used.
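A minimal sketch of the rank transform itself (window size and border handling are arbitrary choices for the illustration; the rank constraint derived in the paper is not reproduced here):

```python
import numpy as np

def rank_transform(image, radius=1):
    """Rank transform: each pixel is replaced by the number of pixels in the
    surrounding (2*radius+1)^2 window whose intensity is less than the centre pixel."""
    h, w = image.shape
    out = np.zeros((h, w), dtype=np.int32)
    for y in range(radius, h - radius):
        for x in range(radius, w - radius):
            window = image[y - radius:y + radius + 1, x - radius:x + radius + 1]
            out[y, x] = np.count_nonzero(window < image[y, x])
    return out

if __name__ == "__main__":
    img = np.array([[1, 2, 3],
                    [4, 5, 6],
                    [7, 8, 9]], dtype=np.float32)
    print(rank_transform(img))   # centre pixel -> 4 (four neighbours are smaller)
```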
Size-resolved particle distribution and gaseous concentrations by real-world road tunnel measurement
Abstract:
Measurements of aerosol particle number size distributions (15-700 nm), CO and NOx were performed in a bus tunnel in Australia. Daily mean particle size distributions of the mixed diesel/CNG (compressed natural gas) bus traffic flow were determined over 4 consecutive measurement days. Emission factors (EFs) for the particle size distributions of diesel buses and CNG buses were obtained by multiple linear regression (MLR); the particle distributions of diesel buses and CNG buses were observed as a single accumulation mode and a single nuclei mode, respectively. The particle size distributions of the mixed traffic flow were decomposed into two log-normal fitting curves for each 30-minute mean scan: the particle number emissions of the mixed fleet could be well fitted by the sum of two log-normal distribution curves, a nuclei-mode curve and an accumulation-mode curve, attributed to the CNG bus and diesel bus particle number emissions respectively. Finally, the particle size distributions of diesel buses and CNG buses were quantified using box-and-whisker charts. For the log-normal particle size distribution of diesel buses, accumulation-mode diameters were 74.5-87.5 nm and geometric standard deviations were 1.89-1.98. For the log-normal particle size distribution of CNG buses, nuclei-mode diameters were 21-24 nm and geometric standard deviations were 1.27-1.31.
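A hedged sketch of the decomposition step: fitting a measured number size distribution with the sum of two log-normal modes (the synthetic data and starting values are invented; the paper's actual fitting procedure may differ):

```python
import numpy as np
from scipy.optimize import curve_fit

def lognormal_mode(dp, n_total, dpg, sigma_g):
    """dN/dlogDp of one log-normal mode with total number n_total,
    geometric mean diameter dpg (nm) and geometric standard deviation sigma_g."""
    return (n_total / (np.sqrt(2 * np.pi) * np.log10(sigma_g)) *
            np.exp(-((np.log10(dp) - np.log10(dpg)) ** 2) /
                   (2 * np.log10(sigma_g) ** 2)))

def two_modes(dp, n1, dpg1, sg1, n2, dpg2, sg2):
    """Nuclei mode (CNG-like) plus accumulation mode (diesel-like)."""
    return lognormal_mode(dp, n1, dpg1, sg1) + lognormal_mode(dp, n2, dpg2, sg2)

if __name__ == "__main__":
    dp = np.logspace(np.log10(15), np.log10(700), 50)          # diameters in nm
    measured = two_modes(dp, 8e3, 22, 1.3, 5e3, 80, 1.9)       # synthetic "measurement"
    p0 = [1e4, 20, 1.3, 1e4, 70, 1.9]                          # initial guesses
    popt, _ = curve_fit(two_modes, dp, measured, p0=p0)
    print(popt)                                                # recovered mode parameters
```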
Abstract:
This project constructs a scheduling solution for the Emergency Department. The schedules are generated in real time to adapt to new patient arrivals and changing conditions. An integrated scheduling formulation assigns patients to beds and treatment tasks to resources. Schedule efficiency is assessed using the waiting time and total care time experienced by patients. The solution algorithm incorporates dispatch rules, meta-heuristics and a new extended disjunctive graph formulation, which together provide high-quality solutions in a fast time frame for real-time decision support. This algorithm can be implemented in an electronic patient management system to improve patient flow in the Emergency Department.
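As a very small illustration of the dispatch-rule component (this toy rule and data are invented; it is not the project's algorithm, which also uses meta-heuristics and a disjunctive graph formulation):

```python
import heapq

# Toy dispatch-rule scheduler: treatment tasks are processed in arrival order and
# each is assigned to the earliest-available resource (illustration only).

def schedule(tasks, n_resources):
    """tasks: list of (arrival_time, duration, patient_id);
    returns a list of (patient, start, end, resource)."""
    resources = [(0.0, r) for r in range(n_resources)]        # (time free, resource id)
    heapq.heapify(resources)
    plan = []
    for arrival, duration, patient in sorted(tasks):          # earliest arrival first
        free_at, r = heapq.heappop(resources)
        start = max(arrival, free_at)
        end = start + duration
        plan.append((patient, start, end, r))
        heapq.heappush(resources, (end, r))
    return plan

if __name__ == "__main__":
    tasks = [(0, 30, "P1"), (5, 10, "P2"), (6, 20, "P3")]
    for row in schedule(tasks, n_resources=2):
        print(row)
```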
Abstract:
The further development of TaqMan quantitative real-time PCR (qPCR) assays for the absolute quantitation of Marek's disease virus serotype 1 (MDV1) and Herpesvirus of turkeys (HVT) viruses is described and the sensitivity and reproducibility of each assay reported. Using plasmid DNA copies, the lower limit of detection was determined to be 5 copies for the MDV1 assay and 75 copies for the HVT assay. Both assays were found to be highly reproducible for Ct values and calculated copy numbers, with mean intra- and inter-assay coefficients of variation being less than 5% for Ct and 20% for calculated copy number. The genome copy number of MDV1 and HVT viruses was quantified in PBL and feather tips from experimentally infected chickens, and in field poultry dust samples. Parallelism was demonstrated between the plasmid-based standard curves and standard curves derived from infected spleen material containing both viral and host DNA, allowing the latter to be used for absolute quantification. These methods should prove useful for the reliable differentiation and absolute quantitation of MDV1 and HVT viruses in a wide range of samples.
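For illustration, absolute quantitation from a plasmid standard curve reduces to a simple calculation; the slope and intercept below are invented example values, not the assay's fitted parameters:

```python
# Absolute quantitation from a qPCR standard curve (illustrative numbers only).
# A standard curve relates Ct to log10(copy number):  Ct = slope * log10(copies) + intercept.

def copies_from_ct(ct, slope=-3.32, intercept=38.0):
    """Invert the standard curve to estimate the copy number in a test sample.
    A slope of about -3.32 corresponds to ~100% amplification efficiency."""
    return 10 ** ((ct - intercept) / slope)

def efficiency(slope=-3.32):
    """Amplification efficiency implied by the standard-curve slope."""
    return 10 ** (-1.0 / slope) - 1.0

if __name__ == "__main__":
    print(round(copies_from_ct(28.0)))     # about 1.0e3 copies with these example values
    print(round(efficiency() * 100), "%")  # about 100 %
```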
Abstract:
This paper presents a chance-constraint programming approach for constructing maximum-margin classifiers which are robust to interval-valued uncertainty in training examples. The methodology ensures that uncertain examples are classified correctly with high probability by employing chance constraints. The main contribution of the paper is to pose the resultant optimization problem as a Second Order Cone Program by using large deviation inequalities due to Bernstein. Apart from the support and mean of the uncertain examples, these Bernstein-based relaxations make no further assumptions on the underlying uncertainty. Classifiers built using the proposed approach are less conservative, yield higher margins and hence are expected to generalize better than existing methods. Experimental results on synthetic and real-world datasets show that the proposed classifiers are better equipped to handle interval-valued uncertainty than the state of the art.
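Schematically, the construction replaces each hard margin constraint by a chance constraint over the uncertain example; the generic form is shown below with a Chebyshev-style cone relaxation for illustration, not the Bernstein relaxation derived in the paper:

```latex
% Chance-constrained maximum-margin classifier (schematic form).
% Each uncertain training example X_i (with label y_i) must satisfy the margin
% constraint with probability at least 1 - \epsilon:
\min_{w,b,\xi}\; \tfrac{1}{2}\|w\|^{2} + C\sum_{i}\xi_{i}
\quad \text{s.t.} \quad
\Pr\!\left[\, y_i \left(w^{\top} X_i + b\right) \ge 1 - \xi_i \,\right] \ge 1 - \epsilon,
\qquad \xi_i \ge 0 .
% With mean \mu_i and covariance bound \Sigma_i for X_i, a Chebyshev-style relaxation
% turns each chance constraint into a second-order cone constraint:
y_i \left(w^{\top} \mu_i + b\right) \ge 1 - \xi_i + \kappa_{\epsilon}\,\bigl\|\Sigma_i^{1/2} w\bigr\|_{2},
\qquad \kappa_{\epsilon} = \sqrt{(1-\epsilon)/\epsilon}.
```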
Abstract:
Bottlenecking operations in a complex coal rail system cost mining companies millions of dollars. To address this issue, this paper investigates a real-world coal rail system and aims to optimise the coal railing operations under constraints of limited resources (e.g., a limited number of locomotives and wagons). In the literature, most studies considered the train scheduling problem on a single-track railway network to be strongly NP-hard and thus developed metaheuristics as the main solution methods. In this paper, a new mathematical programming model is formulated and coded in an optimization programming language based on a constraint programming (CP) approach. A new depth-first-search technique is developed and embedded inside the CP model to obtain the optimised coal railing timetable efficiently. Computational experiments demonstrate that high-quality solutions are obtainable in industry-scale applications. To provide insightful decisions, sensitivity analysis is conducted in terms of different scenarios and specific criteria.
Keywords: Train scheduling · Rail transportation · Coal mining · Constraint programming
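An illustrative constraint-programming fragment for the single-track conflict (written here with Google OR-tools CP-SAT rather than the optimization programming language / CP model used in the paper; train data and horizon are invented):

```python
from ortools.sat.python import cp_model

# Toy CP model of a single-track section shared by several coal trains:
# each train occupies the section for a fixed traversal time, occupations may
# not overlap, and the makespan (last exit time) is minimised. Illustration only.

trains = {"loaded_1": 40, "empty_1": 35, "loaded_2": 40}   # traversal times (minutes)
horizon = sum(trains.values())

model = cp_model.CpModel()
intervals, ends = [], []
for name, duration in trains.items():
    start = model.NewIntVar(0, horizon, f"start_{name}")
    end = model.NewIntVar(0, horizon, f"end_{name}")
    intervals.append(model.NewIntervalVar(start, duration, end, f"occupy_{name}"))
    ends.append(end)

model.AddNoOverlap(intervals)          # only one train on the single-track section at a time
makespan = model.NewIntVar(0, horizon, "makespan")
model.AddMaxEquality(makespan, ends)
model.Minimize(makespan)

solver = cp_model.CpSolver()
if solver.Solve(model) == cp_model.OPTIMAL:
    print("makespan:", solver.Value(makespan))   # -> 115 for this toy instance
```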
Abstract:
The 1980s and the early 1990s have proved to be an important turning point in the history of the Nordic welfare states. After this turning point, the Nordic social order has been built upon a new foundation. This study shows that the new order is mainly built upon new hierarchies and control mechanisms that have been developed consistently through economic and labour market policy measures. During the post-war period, the Nordic welfare states to an increasing extent created equality of opportunity and scope for agency among people. Public social services were available for all and the tax-benefit system maintained a level income distribution. During this golden era of the Nordic welfare state, the scope for agency was, however, limited by social structures. Public institutions and law tended to categorize people according to their life circumstances, ascribing them a predefined role. In the 1980s and 1990s this collectivist social order began to mature and it became subject to political renegotiation. Signs of a new social order in the Nordic countries have included the liberation of the financial markets, the privatizing of public functions and the redefining of the role of the public sector. It is now possible to reassess the ideological foundations of this new order. In contrast to widely used political rhetoric, the foundation of the new order has not been the ideas of individual freedom or choice. Instead, the most important aim appears to have been to control and direct people to act in accordance with the rules of the market. The various levels of government and the social security system have been redirected to serve this goal. Instead of being a mechanism for redistributing income, the Nordic social security system has been geared towards creating new hierarchies on the Nordic labour markets. During the past decades, conditions for receiving income support and unemployment benefit have been tightened in all Nordic countries. As a consequence, people have been forced to accept deteriorating terms and conditions on the labour market. Country-specific variations exist, however: in sum, Sweden has been the most conservative, Denmark the most innovative and Finland the most radical in reforming labour market policy. The new hierarchies on the labour market have coincided with slow or non-existent growth of real wages and with strong growth of the share of capital income. Slow growth of real wages has kept inflation low and thus secured the value of capital. Societal development has thus progressed from equality of opportunity during the age of the welfare states towards a hierarchical social order where the majority of people face increasing constraints and where a fortunate minority enjoys prosperity and security.
Abstract:
Pervasive use of pointers in large-scale real-world applications continues to make points-to analysis an important optimization-enabler. Rapid growth of software systems demands a scalable pointer analysis algorithm. A typical inclusion-based points-to analysis iteratively evaluates constraints and computes a points-to solution until a fixpoint. In each iteration, (i) points-to information is propagated across directed edges in a constraint graph G and (ii) more edges are added by processing the points-to constraints. We observe that prioritizing the order in which the information is processed within each of the above two steps can lead to efficient execution of the points-to analysis. While earlier work in the literature focuses only on the propagation order, we argue that the other dimension, that is, prioritizing the constraint processing, can lead to even higher improvements on how fast the fixpoint of the points-to algorithm is reached. This becomes especially important as we prove that finding an optimal sequence for processing the points-to constraints is NP-Complete. The prioritization scheme proposed in this paper is general enough to be applied to any of the existing points-to analyses. Using the prioritization framework developed in this paper, we implement prioritized versions of Andersen's analysis, Deep Propagation, Hardekopf and Lin's Lazy Cycle Detection and Bloom Filter based points-to analysis. In each case, we report significant improvements in the analysis times (33%, 47%, 44%, 20% respectively) as well as the memory requirements for a large suite of programs, including SPEC 2000 benchmarks and five large open source programs.
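A minimal sketch of the inclusion-based propagation step driven by a prioritised worklist (the priority function here is a placeholder and the model omits on-the-fly edge addition; the paper's prioritisation of constraint processing is not reproduced):

```python
import heapq
from collections import defaultdict

# Andersen-style points-to propagation over a constraint graph, with a
# prioritised worklist (illustrative priority = node id; real schemes differ).

def propagate(edges, initial_pts):
    """edges: dict node -> set of successor nodes (copy constraints succ ⊇ node).
    initial_pts: dict node -> set of objects from address-of constraints."""
    pts = defaultdict(set, {n: set(s) for n, s in initial_pts.items()})
    worklist = [(n, n) for n in pts]           # (priority, node)
    heapq.heapify(worklist)
    while worklist:
        _, n = heapq.heappop(worklist)
        for succ in edges.get(n, ()):
            if not pts[n] <= pts[succ]:        # propagate points-to facts along n -> succ
                pts[succ] |= pts[n]
                heapq.heappush(worklist, (succ, succ))
    return dict(pts)

if __name__ == "__main__":
    edges = {0: {1}, 1: {2}}                   # p0 flows into p1, p1 into p2
    print(propagate(edges, {0: {"objA"}}))     # objA reaches nodes 0, 1 and 2
```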