970 results for Lot-sizing


Relevance:

10.00%

Publisher:

Abstract:

Predictability - the ability to foretell that an implementation will not violate a set of specified reliability and timeliness requirements - is a crucial, highly desirable property of responsive embedded systems. This paper overviews a development methodology for responsive systems, which enhances predictability by eliminating potential hazards resulting from physically unsound specifications. The backbone of our methodology is the Time-constrained Reactive Automaton (TRA) formalism, which adopts a fundamental notion of space and time that restricts expressiveness in a way that allows the specification of only reactive, spontaneous, and causal computation. Using the TRA model, unrealistic systems - possessing properties such as clairvoyance, caprice, infinite capacity, or perfect timing - cannot even be specified. We argue that this "ounce of prevention" at the specification level is likely to spare a lot of time and energy in the development cycle of responsive systems - not to mention the elimination of potential hazards that would otherwise have gone unnoticed. The TRA model is presented to system developers through the CLEOPATRA programming language. CLEOPATRA features a C-like imperative syntax for the description of computation, which makes it easier to incorporate into applications already using C. It is event-driven, and thus appropriate for embedded process control applications. It is object-oriented and compositional, thus advocating modularity and reusability. CLEOPATRA is semantically sound; its objects can be transformed, mechanically and unambiguously, into formal TRA automata for verification purposes, which can be pursued using model-checking or theorem-proving techniques. Since 1989, an ancestor of CLEOPATRA has been in use as a specification and simulation language for embedded time-critical robotic processes.

Relevance:

10.00%

Publisher:

Abstract:

Predictability -- the ability to foretell that an implementation will not violate a set of specified reliability and timeliness requirements -- is a crucial, highly desirable property of responsive embedded systems. This paper overviews a development methodology for responsive systems, which enhances predictability by eliminating potential hazards resulting from physically unsound specifications. The backbone of our methodology is a formalism that restricts expressiveness in a way that allows the specification of only reactive, spontaneous, and causal computation. Unrealistic systems -- possessing properties such as clairvoyance, caprice, infinite capacity, or perfect timing -- cannot even be specified. We argue that this "ounce of prevention" at the specification level is likely to spare a lot of time and energy in the development cycle of responsive systems -- not to mention the elimination of potential hazards that would otherwise have gone unnoticed.

Relevance:

10.00%

Publisher:

Abstract:

Predictability -- the ability to foretell that an implementation will not violate a set of specified reliability and timeliness requirements -- is a crucial, highly desirable property of responsive embedded systems. This paper overviews a development methodology for responsive systems, which enhances predictability by eliminating potential hazards resulting from physically unsound specifications. The backbone of our methodology is the Time-constrained Reactive Automaton (TRA) formalism, which adopts a fundamental notion of space and time that restricts expressiveness in a way that allows the specification of only reactive, spontaneous, and causal computation. Using the TRA model, unrealistic systems -- possessing properties such as clairvoyance, caprice, infinite capacity, or perfect timing -- cannot even be specified. We argue that this "ounce of prevention" at the specification level is likely to spare a lot of time and energy in the development cycle of responsive systems -- not to mention the elimination of potential hazards that would otherwise have gone unnoticed. The TRA model is presented to system developers through the Cleopatra programming language. Cleopatra features a C-like imperative syntax for the description of computation, which makes it easier to incorporate into applications already using C. It is event-driven, and thus appropriate for embedded process control applications. It is object-oriented and compositional, thus advocating modularity and reusability. Cleopatra is semantically sound; its objects can be transformed, mechanically and unambiguously, into formal TRA automata for verification purposes, which can be pursued using model-checking or theorem-proving techniques. Since 1989, an ancestor of Cleopatra has been in use as a specification and simulation language for embedded time-critical robotic processes.

Relevance:

10.00%

Publisher:

Abstract:

TCP performance degrades when end-to-end connections extend over wireless links, which are characterized by high bit error rates and intermittent connectivity. Such link characteristics can significantly degrade TCP performance, as the TCP sender assumes wireless losses to be congestion losses, resulting in unnecessary congestion control actions. Link errors can be reduced by increasing transmission power, code redundancy (FEC) or the number of retransmissions (ARQ). But increasing power costs resources, increasing code redundancy reduces available channel bandwidth, and increasing persistency increases end-to-end delay. The paper proposes a TCP optimization through proper tuning of power management, FEC and ARQ in wireless environments (WLAN and WWAN). In particular, we conduct analytical and numerical analyses of TCP (including "wireless-aware" TCP) performance under different settings. Our results show that increasing power, redundancy and/or retransmission levels always improves TCP performance by reducing link-layer losses. However, such improvements are often associated with cost, and arbitrary improvement cannot be realized without paying a lot in return. It is therefore important to consider some kind of net utility function that should be optimized, thus maximizing throughput at the least possible cost.
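As a rough, hypothetical illustration of the net-utility trade-off this abstract argues for (not the paper's actual model), the Python sketch below assumes a simple link model in which higher transmit power and more FEC reduce the residual loss seen by TCP while consuming energy and bandwidth, and it searches a small grid of settings for the one that maximizes throughput minus cost. All parameter names, formulas and cost weights are invented.

import itertools

def residual_loss(power, fec_overhead, arq_retries,
                  base_ber=1e-4, pkt_bits=8000):
    """Hypothetical residual packet-loss model: more power and more FEC lower
    the effective bit error rate; each ARQ retry is another delivery attempt."""
    ber = base_ber / (power * (1.0 + 10.0 * fec_overhead))
    pkt_loss = 1.0 - (1.0 - ber) ** pkt_bits
    return pkt_loss ** (1 + arq_retries)      # lost only if every attempt fails

def tcp_throughput(loss, rtt=0.1, mss=1460):
    """Classic square-root throughput approximation, in bytes per second."""
    return (mss / rtt) * (1.5 / loss) ** 0.5

def net_utility(power, fec_overhead, arq_retries,
                w_power=2e5, w_delay=1e5, capacity=1.25e6):
    """Throughput minus hypothetical costs for energy, FEC overhead and ARQ delay."""
    loss = residual_loss(power, fec_overhead, arq_retries)
    goodput = min(tcp_throughput(loss), capacity * (1.0 - fec_overhead))
    return goodput - w_power * power - w_delay * arq_retries * loss

best = max(itertools.product([1.0, 2.0, 4.0],   # power levels
                             [0.0, 0.1, 0.2],   # FEC overhead fraction
                             [0, 1, 2]),        # ARQ retry limit
           key=lambda s: net_utility(*s))
print("best (power, FEC overhead, ARQ retries):", best)

Under such a model, raising power, redundancy or persistence indefinitely stops paying off once the marginal throughput gain drops below the associated cost, which is the qualitative point the abstract makes about optimizing a net utility function rather than any single knob.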

Relevance:

10.00%

Publisher:

Abstract:

Wireless sensor networks have recently emerged as enablers of important applications such as environmental, chemical and nuclear sensing systems. Such applications have sophisticated spatial-temporal semantics that set them apart from traditional wireless networks. For example, the computation of temperature averaged over the sensor field must take into account local densities. This is crucial since otherwise the estimated average temperature can be biased by over-sampling areas where a lot more sensors exist. Thus, we envision that a fundamental service that a wireless sensor network should provide is that of estimating local densities. In this paper, we propose a lightweight probabilistic density inference protocol, called DIP, which allows each sensor node to implicitly estimate its neighborhood size without the explicit exchange of node identifiers as in existing density discovery schemes. The theoretical basis of DIP is a probabilistic analysis which gives the relationship between the number of sensor nodes contending in the neighborhood of a node and the level of contention measured by that node. Extensive simulations confirm the premise of DIP: it can provide statistically reliable and accurate estimates of local density at a very low energy cost and constant running time. We demonstrate how applications could be built on top of our DIP-based service by computing density-unbiased statistics from estimated local densities.
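A minimal sketch of the general idea behind contention-based density estimation follows (it is not DIP's actual analysis or message format): if each of n neighbors transmits in a slot independently with a known probability p, the fraction of slots a listening node observes as busy is about 1 - (1 - p)^n, so n can be recovered by inverting the measured busy fraction. All names and parameters are hypothetical.

import math
import random

def estimate_neighbors(busy_fraction, tx_prob):
    """Invert busy_fraction = 1 - (1 - tx_prob)^n to estimate the neighborhood size n."""
    busy_fraction = min(max(busy_fraction, 1e-6), 1.0 - 1e-6)   # guard the logarithms
    return math.log(1.0 - busy_fraction) / math.log(1.0 - tx_prob)

def measure_busy_fraction(true_n=20, tx_prob=0.05, slots=2000, seed=1):
    """Simulate slots in which at least one of true_n neighbors transmits."""
    rng = random.Random(seed)
    busy = sum(any(rng.random() < tx_prob for _ in range(true_n))
               for _ in range(slots))
    return busy / slots

measured = measure_busy_fraction()
print("estimated neighborhood size:", round(estimate_neighbors(measured, 0.05), 1))

The actual protocol and its probabilistic analysis are more involved; the sketch is only meant to convey why the level of contention measured at a node carries information about how many neighbors it has.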

Relevance:

10.00%

Publisher:

Abstract:

System F is a type system that can be seen as both a proof system for second-order propositional logic and as a polymorphic programming language. In this work we explore several extensions of System F by types which express subtyping constraints. These systems include terms which represent proofs of subtyping relationships between types. Given a proof that one type is a subtype of another, one may use a coercion term constructor to coerce terms from the first type to the second. The ability to manipulate type constraints as first-class entities gives these systems a lot of expressive power, including the ability to encode generalized algebraic data types and intensional type analysis. The main contributions of this work are in the formulation of constraint types and a proof of strong normalization for an extension of System F with constraint types.
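As a schematic illustration of the mechanism this abstract describes (the notation below is ours, not necessarily the paper's), a coercion construct eliminates a subtyping proof, and a constraint type internalizes a subtyping assumption that can be discharged later:

% Illustrative rules only: c ranges over proof terms of subtyping judgements,
% and (S <: T) => sigma is a type guarded by a subtyping constraint.
\[
\frac{\Gamma \vdash c : S <: T \qquad \Gamma \vdash e : S}
     {\Gamma \vdash \mathsf{coerce}[c]\, e : T}
\qquad
\frac{\Gamma,\, c : S <: T \vdash e : \sigma}
     {\Gamma \vdash \lambda(c : S <: T).\, e : (S <: T) \Rightarrow \sigma}
\]

Read this way, a term of type (S <: T) => sigma behaves like a function over subtyping evidence, which is the sense in which type constraints become first-class entities; the formulation of such constraint types and the proof of strong normalization for the extended system are the contributions the abstract names.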

Relevance:

10.00%

Publisher:

Abstract:

Partial occlusions are commonplace in a variety of real world computer vision applications: surveillance, intelligent environments, assistive robotics, autonomous navigation, etc. While occlusion handling methods have been proposed, most methods tend to break down when confronted with numerous occluders in a scene. In this paper, a layered image-plane representation for tracking people through substantial occlusions is proposed. An image-plane representation of motion around an object is associated with a pre-computed graphical model, which can be instantiated efficiently during online tracking. A global state and observation space is obtained by linking transitions between layers. A Reversible Jump Markov Chain Monte Carlo approach is used to infer the number of people and track them online. The method outperforms two state-of-the-art methods for tracking over extended occlusions, given videos of a parking lot with numerous vehicles and a laboratory with many desks and workstations.
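The online inference step mentioned here can be pictured with a generic reversible-jump MCMC skeleton (a sketch of the general technique only, not the paper's layered model or its proposal design): birth and death moves change the hypothesized number of people, an update move perturbs an existing hypothesis, and each proposal is accepted with a Metropolis-style ratio. The toy one-dimensional likelihood and all parameters below are placeholders.

import math
import random

rng = random.Random(0)
OBSERVED = [2.0, 2.2, 7.9, 8.1]      # toy 1-D "detections" standing in for image evidence

def log_posterior(positions, noise=0.3, rate=2.0):
    """Toy target: each detection is explained by the nearest hypothesized person;
    a per-person penalty plays the role of a prior over the number of people."""
    if not positions:
        return -1e9
    fit = sum(max(-0.5 * ((z - x) / noise) ** 2 for x in positions)
              for z in OBSERVED)
    return fit - rate * len(positions)

def rjmcmc(iters=5000):
    state = [rng.uniform(0.0, 10.0)]          # start with a single person
    for _ in range(iters):
        move = rng.choice(["birth", "death", "update"])
        proposal = list(state)
        if move == "birth":
            proposal.append(rng.uniform(0.0, 10.0))
        elif move == "death" and len(proposal) > 1:
            proposal.pop(rng.randrange(len(proposal)))
        elif move == "update":
            i = rng.randrange(len(proposal))
            proposal[i] += rng.gauss(0.0, 0.5)
        # Simplified acceptance test: symmetric proposals assumed and the
        # dimension-matching terms of the full reversible-jump ratio omitted.
        if math.log(rng.random()) < log_posterior(proposal) - log_posterior(state):
            state = proposal
    return state

sample = rjmcmc()
print("sampled person count:", len(sample),
      "positions:", [round(x, 2) for x in sample])

After enough iterations the sampler concentrates on two well-separated positions for this toy data, mirroring how the approach in the abstract jointly infers the number of people and their states.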

Relevance:

10.00%

Publisher:

Abstract:

snBench is a platform on which novice users compose and deploy distributed Sense and Respond programs for simultaneous execution on a shared, distributed infrastructure. It is a natural imperative that we have the ability to (1) verify the safety/correctness of newly submitted tasks and (2) derive the resource requirements for these tasks such that correct allocation may occur. To achieve these goals we have established a multi-dimensional sized type system for our functional-style Domain Specific Language (DSL) called Sensor Task Execution Plan (STEP). In such a type system data types are annotated with a vector of size attributes (e.g., upper and lower size bounds). Tracking multiple size aspects proves essential in a system in which Images are manipulated as a first class data type, as image manipulation functions may have specific minimum and/or maximum resolution restrictions on the input they can correctly process. Through static analysis of STEP instances we not only verify basic type safety and establish upper computational resource bounds (i.e., time and space), but we also derive and solve data and resource sizing constraints (e.g., Image resolution, camera capabilities) from the implicit constraints embedded in program instances. In fact, the static methods presented here have benefit beyond their application to Image data, and may be extended to other data types that require tracking multiple dimensions (e.g., image "quality", video frame-rate or aspect ratio, audio sampling rate). In this paper we present the syntax and semantics of our functional language, our type system that builds costs and resource/data constraints, and (through both formalism and specific details of our implementation) provide concrete examples of how the constraints and sizing information are used in practice.
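The multi-dimensional size-annotation idea can be sketched as follows; this is a hypothetical illustration in Python, not the STEP type checker, and all capability numbers are invented. Each image type carries lower and upper bounds on its resolution, each operation declares the bounds it can accept, and composing a plan intersects the constraints, rejecting plans whose constraint set is unsatisfiable.

from dataclasses import dataclass

@dataclass(frozen=True)
class ImageType:
    """An image type annotated with a vector of size bounds (pixels)."""
    min_w: int
    min_h: int
    max_w: int
    max_h: int

    def meet(self, other):
        """Intersect two sets of resolution constraints; None if unsatisfiable."""
        lo_w, lo_h = max(self.min_w, other.min_w), max(self.min_h, other.min_h)
        hi_w, hi_h = min(self.max_w, other.max_w), min(self.max_h, other.max_h)
        if lo_w > hi_w or lo_h > hi_h:
            return None
        return ImageType(lo_w, lo_h, hi_w, hi_h)

# Hypothetical capability declarations for a camera source and two operators.
camera      = ImageType(320, 240, 1280, 960)   # resolutions the sensor can produce
face_detect = ImageType(640, 480, 4096, 4096)  # needs at least VGA input
thumbnail   = ImageType(1, 1, 320, 240)        # can only handle small frames

def check_plan(source, *stages):
    """Statically check a pipeline by intersecting constraints left to right."""
    current = source
    for stage in stages:
        current = current.meet(stage)
        if current is None:
            return None
    return current

print("camera -> face_detect:             ", check_plan(camera, face_detect))
print("camera -> face_detect -> thumbnail:", check_plan(camera, face_detect, thumbnail))

The first pipeline checks to a non-empty resolution range, while the second is rejected because the thumbnail stage's upper bound falls below the detector's lower bound; deriving and solving exactly this kind of implicit sizing constraint is what the static analysis described above is for.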

Relevance:

10.00%

Publisher:

Abstract:

This dissertation applies a variety of quantitative methods to electricity and carbon market data, utility company accounts data, and capital and operating costs to analyse some of the challenges associated with investment in energy assets. In particular, three distinct research topics are analysed within this general theme, in successive sections: the efficiency of interconnector trading, the optimal sizing of intermittent wind facilities, and the impact of carbon pricing on the cost of capital for investors.

Relevance:

10.00%

Publisher:

Abstract:

This thesis examines the experiences of the biological children of foster carers. In particular it explores their experiences in relation to inclusion, consultation and decision-making. The study also examines the support and training needs of birth children in foster families. Using a qualitative methodology, in-depth, semi-structured interviews were conducted with fifteen birth children of foster carers aged between 18 and 30 years. The research findings show that for the majority of birth children, fostering was overall a positive experience which helped them develop into individuals who were caring and nonjudgemental. However, from the data collected in this study, it is clear that fostering also brings a range of challenges for birth children in foster families, such as managing feelings of loss, grief, jealousy and guilt when foster children leave. Birth children were reluctant to discuss these issues with their parents and often did not approach fostering social workers, as they did not have a meaningful relationship with them in which to discuss their concerns. The findings also demonstrate that birth children undertake a lot of emotional work in supporting their parents, birth siblings and foster siblings. Despite the important role played by birth children in the fostering process, this contribution often goes unrecognised and unacknowledged by fostering professionals and agencies, with birth children not included or consulted about foster care decisions that affect them. It is argued here that birth children are viewed by foster care professionals and agencies from a deficit-based perspective. However, this study contends that it is not just foster parents who are involved in the foster care process, but the entire foster family. The findings of this study show that birth children are competent social actors capable of making valuable contributions to foster care decisions that affect their lives and that of their family.

Relevance:

10.00%

Publisher:

Abstract:

Folliculogenesis is a complex process regulated by various paracrine and autocrine factors. In vitro growth systems of primordial and preantral follicles have been developed for the future use of immature oocytes as sources of fertilizable oocytes and for studying follicular growth and oocyte maturation mechanisms. Rodents were often chosen for in vitro follicular culture research, and a lot of factors implicated in folliculogenesis have been identified using this model. To date, the mouse is the only species in which the whole process of follicular growth, oocyte maturation, fertilization and embryo transfer into recipient females was successfully performed. However, the efficiency of in vitro culture systems must still be considerably improved. Within the follicle, numerous events affect cell proliferation and the acquisition of oocyte developmental competency in vitro, including interactions between the follicular cells and the oocyte, and the composition of the culture medium. The effects of these factors depend on the stage of follicle development, the culture system used and the species. This paper reviews the action of endocrine and paracrine factors, and of other components of the culture medium, on in vitro growth of preantral follicles in rodents.

Relevance:

10.00%

Publisher:

Abstract:

OBJECTIVE: The Veterans Health Administration has developed My HealtheVet (MHV), a Web-based portal that links veterans to their care in the Veterans Affairs (VA) system. The objective of this study was to measure diabetic veterans' access to and use of the Internet, and their interest in using MHV to help manage their diabetes. MATERIALS AND METHODS: Cross-sectional mailed survey of 201 patients with type 2 diabetes and hemoglobin A1c > 8.0% receiving primary care at any of five primary care clinic sites affiliated with a VA tertiary care facility. Main measures included Internet usage, access, and attitudes; computer skills; interest in using the Internet; awareness of and attitudes toward MHV; demographics; and socioeconomic status. RESULTS: A majority of respondents reported having access to the Internet at home. Nearly half of all respondents had searched online for information about diabetes, including some who did not have home Internet access. More than a third obtained "some" or "a lot" of their health-related information online. Forty-one percent reported being "very interested" in using MHV to help track their home blood glucose readings, a third of whom did not have home Internet access. Factors associated with being "very interested" were as follows: having access to the Internet at home (p < 0.001), "a lot/some" trust in the Internet as a source of health information (p = 0.002), lower age (p = 0.03), and some college (p = 0.04). Neither race (p = 0.44) nor income (p = 0.25) was significantly associated with interest in MHV. CONCLUSIONS: This study found that a diverse sample of older VA patients with sub-optimally controlled diabetes had a level of familiarity with and access to the Internet comparable to an age-matched national sample. In addition, there was a high degree of interest in using the Internet to help manage their diabetes.

Relevance:

10.00%

Publisher:

Abstract:

During the summer of 1994, Archaeology in Annapolis conducted archaeological investigations of the city block bounded by Franklin, South and Cathedral Streets in the city of Annapolis. This Phase III excavation was conducted as a means to identify subsurface cultural resources in the impact area associated with the proposed construction of the Anne Arundel County Courthouse addition. This impact area included both the upper and lower parking lots used by Courthouse employees. Investigations were conducted in the form of mechanical trenching and hand-excavated units. Excavations in the upper lot area yielded significant information concerning the interior area of the block. Known as Bellis Court, this series of rowhouses was constructed in the late nineteenth century and was used as rental properties by African-Americans. The dwellings remained until the middle of the twentieth century when they were demolished in preparation for the construction of a Courthouse addition. Portions of the foundation of a house owned by William H. Bellis in the 1870s were also exposed in this area. Construction of this house was begun by William Nicholson around 1730 and completed by Daniel Dulany in 1732/33. It was demolished in 1896 by James Munroe, a Trustee for Bellis. Excavations in the upper lot also revealed the remains of a late seventeenth/early eighteenth century wood-lined cellar, believed to be part of the earliest known structure on Lot 58. After an initially rapid deposition of fill around 1828, this cellar was gradually covered with soil throughout the remainder of the nineteenth century. The fill deposit in the cellar feature yielded a mixed assemblage of artifacts that included sherds of early materials such as North Devon gravel-tempered earthenware, North Devon sgraffito and Northern Italian slipware, along with creamware, pearlware and whiteware. In the lower parking lot, numerous artifacts were recovered from yard scatter associated with the houses that at one time fronted along Cathedral Street and were occupied by African-Americans. An assemblage of late seventeenth century/early eighteenth century materials and several slag deposits from an early forge were recovered from this second area of study. The materials associated with the forge, including portions of a crucible, provided evidence of some of the earliest industry in Annapolis. Investigations in both the upper and lower parking lots added to the knowledge of the changing landscape within the project area, including a prevalence of open space in early periods, a surprising survival of impermanent structures, and a gradual regrading and filling of the block with houses and interior courts. Excavations at the Anne Arundel County Courthouse proved this to be a multi-component site, rich in cultural resources from Annapolis' Early Settlement Period through its Modern Period (as specified by Maryland's Comprehensive Historic Preservation Plan (Weissman 1986)). This report provides detailed interpretations of the archaeological findings of these Phase III investigations.

Relevance:

10.00%

Publisher:

Abstract:

Archaeological investigation at the Slayton House site in Annapolis revealed evidence of occupation of the lot since the early 18th century. The intact late 18th century ground surfaces on which John Ridout built the row houses, and subsequent changes in the landscape and use of the yard as work space in the 19th century were discovered. There was ample visible evidence of the early 20th century landscape and use of the yard as a pleasure garden when excavation was started. Deposits inside the house were quite disturbed, but there was evidence of the work done by the African Americans who lived there. A number of artifacts were found which may indicate the slaves and free African Americans were practicing African-related folk beliefs. No further investigations are recommended for the site. However, if severe or deep ground-disturbing activities were to take place on the property, they should be monitored by a qualified archaeologist.

Relevance:

10.00%

Publisher:

Abstract:

193 Main Street (18AP44) is located between Main Street and Duke of Gloucester Street. The property was used as a yard related to residential and commercial buildings during the 18th and 19th centuries. In the 1930s a movie theatre and parking lot were built on the property. That structure was torn down in the 1980s and a three-story commercial building was constructed. Archaeological excavations were conducted on the property from 1985-1987. A preliminary report was written in 1986 by Paul A. Shackel. This report is the final report on the archaeological investigations at 193 Main Street.