912 results for Language design
Abstract:
The effective daylighting of multistorey commercial building interiors poses an interesting problem for designers in Australia's tropical and subtropical context. Given that a building exterior receives adequate sun and skylight as dictated by location-specific factors such as weather, siting and external obstructions, the availability of daylight throughout its interior depends on certain building characteristics: the distance from a window façade (room depth), ceiling or window head height, window size and the visible transmittance of daylighting apertures. The daylighting of general stock, multistorey commercial buildings is made difficult by their design limitations with respect to some of these characteristics. The admission of daylight to these interiors is usually exclusively by vertical windows. Using conventional glazing, such windows can only admit sun and skylight to a depth of approximately twice the window height. This penetration depth is typically much less than the depth of the office interiors, so that core areas of these buildings receive little or no daylight. This issue is particularly relevant where deep, open plan office layouts prevail. The resulting interior daylight pattern is a relatively narrow perimeter zone bathed in (sometimes too intense) light, contrasted with a poorly daylit core zone. The broad luminance range this may present to a building occupant's visual field can be a source of discomfort glare. Furthermore, the need in most tropical and subtropical regions to restrict solar heat gains to building interiors for much of the year has resulted in the widespread use of heavily tinted or reflective glazing on commercial building façades. This strategy reduces the amount of solar radiation admitted to the interior, thereby decreasing daylight levels proportionately throughout; however, it does little to improve the way light is distributed throughout the office space. Where clear skies dominate weather conditions, direct sunlight may at different times of day or year pass unobstructed through vertical windows, causing disability or discomfort glare for building occupants; its admission to an interior must therefore be appropriately controlled. Any daylighting system to be applied to multistorey commercial buildings must address these design obstacles and attempt to improve the distribution of daylight throughout these deep, sidelit office spaces without causing glare conditions. The research described in this thesis delineates first the design optimisation and then the prototyping and manufacture of a daylighting device to be applied to such multistorey buildings in tropical and subtropical environments.
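As a rough numeric illustration of the rule of thumb cited in the abstract (conventional vertical glazing admits useful daylight to a depth of roughly twice the window head height), the short sketch below estimates the daylit fraction of a deep, side-lit floor plate. The 2x factor follows the abstract's rule of thumb; the room dimensions are illustrative assumptions, not values taken from the thesis.

```python
# Illustrative use of the "daylight penetrates ~2x the window head height" rule of thumb.
# Dimensions are hypothetical, not taken from the thesis.

def daylit_fraction(window_head_height_m: float, room_depth_m: float,
                    penetration_factor: float = 2.0) -> float:
    """Estimate the fraction of room depth that receives useful daylight."""
    penetration_depth = penetration_factor * window_head_height_m
    return min(penetration_depth / room_depth_m, 1.0)

# A 2.7 m window head in a 12 m deep open-plan office: under half the depth is daylit.
print(f"{daylit_fraction(2.7, 12.0):.0%} of the floor depth receives useful daylight")
```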
Abstract:
The topic of the present work is the relationship between the power of learning algorithms on the one hand and the expressive power of the logical language used to represent the problems to be learned on the other. The central question is whether enriching the language results in more learning power. In order to make the question relevant and nontrivial, it is required that both texts (sequences of data) and hypotheses (guesses) be translatable from the "rich" language into the "poor" one. The issue is considered for several logical languages suitable for describing structures whose domain is the set of natural numbers. It is shown that enriching the language gives no advantage for those languages which define a monadic second-order language that is decidable in the following sense: there is a fixed interpretation in the structure of the natural numbers such that the set of sentences of this extended language true in that structure is decidable. But enriching the original language by even a single constant gives an advantage if the language contains a binary function symbol (interpreted as addition). Furthermore, it is shown that behaviourally correct learning has exactly the same power as learning in the limit for those languages which define a monadic second-order language with the property given above, but has more power for languages containing a binary function symbol. Adding the natural requirement that the set of all structures to be learned be recursively enumerable, it is shown that it pays off to enrich the language of arithmetic for both finite learning and learning in the limit, but it does not pay off for behaviourally correct learning.
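The notion of learning in the limit used above can be made concrete with a toy example that is not from the paper: a learner that identifies, from a growing text of data, which finite initial segment {0, ..., n} of the natural numbers it is seeing. After finitely many data points the learner's guess stabilises on the correct hypothesis and never changes again, which is exactly the convergence requirement of learning in the limit.

```python
# Toy illustration (not from the paper) of learning in the limit:
# the target structures are the finite initial segments L_n = {0, ..., n},
# and the learner's hypothesis is simply "n = largest element seen so far".
from typing import Iterable, List

def limit_learner(text: Iterable[int]) -> List[int]:
    """Return the sequence of hypotheses (values of n) after each datum."""
    hypotheses = []
    current = -1
    for datum in text:
        current = max(current, datum)   # revise the guess only when forced to
        hypotheses.append(current)
    return hypotheses

# A text (data sequence) for L_4 = {0, 1, 2, 3, 4}; the guesses converge to 4
# after finitely many data and never change again -- learning in the limit.
print(limit_learner([0, 2, 1, 4, 3, 4, 0, 2]))   # [0, 2, 2, 4, 4, 4, 4, 4]
```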
Abstract:
Automatic spoken Language Identification (LID) is the process of identifying the language spoken within an utterance. The challenge that this task presents is that no prior information is available indicating the content of the utterance or the identity of the speaker. The trend of globalization and the pervasive popularity of the Internet will amplify the need for the capabilities spoken language identification systems provide. A prominent application arises in call centers dealing with speakers of different languages. Another important application is to index or search huge speech data archives and corpora that contain multiple languages. The aim of this research is to develop techniques targeted at producing a faster and more accurate automatic spoken LID system than those presented at the previous National Institute of Standards and Technology (NIST) Language Recognition Evaluation. Acoustic and phonetic speech information are targeted as the most suitable features for representing the characteristics of a language. To model the acoustic speech features, a Gaussian Mixture Model based approach is employed. Phonetic speech information is extracted using existing speech recognition technology. Various techniques to improve LID accuracy are also studied. One approach examined is the employment of Vocal Tract Length Normalization to reduce the speech variation caused by different speakers. A linear data fusion technique is adopted to combine the various aspects of information extracted from speech. As a result of this research, a LID system was implemented and presented for evaluation in the 2003 Language Recognition Evaluation conducted by NIST.
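A minimal sketch of the Gaussian Mixture Model acoustic approach described above, assuming one GMM per language trained on acoustic feature vectors (e.g. MFCCs) and classification by average log-likelihood. The random feature arrays stand in for real speech features, and the language names, dimensionality, and component count are illustrative assumptions, not details of the evaluated system.

```python
# Minimal sketch of GMM-based spoken language identification (illustrative only):
# one GMM per language, decision by highest average log-likelihood on the utterance.
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(0)

# Placeholder acoustic features (e.g. MFCC vectors); a real system would extract
# these from speech with a front end and apply techniques such as VTLN first.
train_feats = {
    "english": rng.normal(0.0, 1.0, size=(2000, 13)),
    "mandarin": rng.normal(0.5, 1.2, size=(2000, 13)),
}

models = {
    lang: GaussianMixture(n_components=8, covariance_type="diag",
                          random_state=0).fit(feats)
    for lang, feats in train_feats.items()
}

def identify(utterance_feats: np.ndarray) -> str:
    """Pick the language whose GMM gives the highest average log-likelihood."""
    scores = {lang: gmm.score(utterance_feats) for lang, gmm in models.items()}
    return max(scores, key=scores.get)

test_utterance = rng.normal(0.5, 1.2, size=(300, 13))
print(identify(test_utterance))
```

In a complete system of the kind the abstract describes, such acoustic scores would then be linearly fused with scores from phonetic subsystems built on existing speech recognition technology.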
Abstract:
This paper discusses the effects of a thyristor controlled series compensator (TCSC), a series FACTS controller, on the transient stability of a power system. Trajectory sensitivity analysis (TSA) has been used to measure the transient stability condition of the system. The TCSC is modeled as a variable capacitor whose value changes with the firing angle. It is shown that TSA can be used in the design of the controller. The optimal locations of the TCSC controller for different fault conditions can also be identified with the help of TSA. The paper demonstrates the advantage of using a TCSC with a suitable controller over fixed-capacitor operation.
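The statement that the TCSC is modeled as a variable capacitor whose value changes with the firing angle can be illustrated with a commonly used simplified fundamental-frequency model: a fixed capacitor in parallel with a thyristor-controlled reactor (TCR). This sketch is not the paper's model, and the reactance values are hypothetical.

```python
# Simplified fundamental-frequency TCSC model (illustrative, not the paper's model):
# fixed capacitor in parallel with a thyristor-controlled reactor (TCR).
# alpha is the firing angle measured from the voltage zero crossing (90..180 deg).
import numpy as np

X_C = 15.0   # capacitive reactance of the fixed capacitor, ohms (hypothetical)
X_L = 2.5    # reactance of the TCR inductor, ohms (hypothetical)

def tcsc_reactance(alpha: float) -> float:
    """Net TCSC reactance in ohms; positive = capacitive, negative = inductive."""
    sigma = 2.0 * (np.pi - alpha)                    # conduction angle of the TCR
    if sigma <= 1e-9:                                # thyristors blocked: capacitor alone
        return X_C
    x_tcr = np.pi * X_L / (sigma - np.sin(sigma))    # fundamental-frequency TCR reactance
    return X_C * x_tcr / (x_tcr - X_C)               # parallel LC combination

for alpha_deg in (90, 140, 160, 175):
    x = tcsc_reactance(np.radians(alpha_deg))
    print(f"alpha = {alpha_deg:3d} deg -> X_TCSC = {x:8.2f} ohm")
```

The net reactance swings from inductive (full TCR conduction) through a resonance region to a variable capacitive boost above the fixed capacitor's own reactance, which is the behaviour a firing-angle controller exploits.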
Abstract:
The paper discusses the operating principles and control characteristics of a dynamic voltage restorer (DVR) that protects sensitive but unbalanced and/or distorted loads. The main aim of the DVR is to regulate the voltage at the load terminal irrespective of sag/swell, distortion, or unbalance in the supply voltage. In this paper, the DVR is operated in such a fashion that it does not supply or absorb any active power during the steady-state operation. Hence, a DC capacitor rather than a DC source can supply the voltage source inverter realizing the DVR. The proposed DVR operation is verified through extensive digital computer simulation studies.
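The zero-active-power operating mode described above can be illustrated with a single-phase phasor sketch that is not taken from the paper: the DVR injects a series voltage in quadrature with the load current, so the injected active power Re{V_inj I*} is zero while the load-terminal voltage magnitude is restored. All per-unit values below are hypothetical.

```python
# Phasor sketch (illustrative only) of a DVR injecting a series voltage in
# quadrature with the load current, so that it exchanges no active power.
import cmath
import math

V_target = 1.00                                       # desired load-voltage magnitude, pu
V_supply = 0.70 * cmath.exp(1j * 0.0)                 # sagged supply voltage (30 % sag), pu
I_load   = 0.80 * cmath.exp(-1j * math.radians(25))   # load current phasor, pu

u = I_load / abs(I_load)                  # unit phasor along the load current
# Constrain the injection to V_inj = j*x*u (perpendicular to I_load) and solve
# |V_supply + j*x*u| = V_target, which is a quadratic in the real unknown x.
b = (V_supply * u.conjugate()).imag
disc = b * b - (abs(V_supply) ** 2 - V_target ** 2)   # > 0 whenever there is a sag
x = min(-b + math.sqrt(disc), -b - math.sqrt(disc), key=abs)
V_inj = 1j * x * u                        # series voltage injected by the DVR

V_load = V_supply + V_inj
P_dvr = (V_inj * I_load.conjugate()).real # active power exchanged by the DVR (~0)

print(f"|V_load| = {abs(V_load):.3f} pu, "
      f"|V_inj| = {abs(V_inj):.3f} pu, P_dvr = {P_dvr:.2e} pu")
```

Because the injection carries no average active power, the inverter's DC bus only has to supply losses and transients, which is why a DC capacitor rather than a DC source can support it.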
Abstract:
This paper presents a case study of a design for a complete micro air vehicle thruster. Fixed-pitch small-scale rotors, brushless motors, lithium-polymer cells, and embedded control are combined to produce a mechanically simple, high-performance thruster with potentially high reliability. The custom rotor design requires balancing the manufacturing simplicity and rigidity of a blade against its aerodynamic performance. An iterative steady-state aeroelastic simulator is used for holistic blade design. The aerodynamic load disturbances of the rotor-motor system in normal conditions are experimentally characterized. The motors require fast dynamic response for authoritative vehicle flight control. We detail a dynamic compensator that achieves satisfactory closed-loop response time. The experimental rotor-motor plant displayed satisfactory thrust performance and dynamic response.
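As a hedged illustration of the closed-loop speed-up the abstract mentions (the paper's actual plant model and compensator are not reproduced here), the sketch below treats the rotor-motor thrust response as a first-order lag and closes the loop with a simple PI compensator, comparing open- and closed-loop rise behaviour. The time constant and gains are hypothetical.

```python
# Illustrative sketch (not the paper's design): first-order rotor-motor thrust model
# with a PI compensator, simulated by forward-Euler integration.
import numpy as np

tau, K = 0.20, 1.0        # hypothetical plant time constant (s) and DC gain
kp, ki = 3.0, 25.0        # hypothetical PI gains
dt, T = 1e-4, 1.0         # integration step and simulation horizon (s)

t = np.arange(0.0, T, dt)
ref = np.ones_like(t)     # unit step in commanded thrust

def simulate(closed_loop: bool) -> np.ndarray:
    y, integ = 0.0, 0.0
    out = np.empty_like(t)
    for k in range(len(t)):
        if closed_loop:
            e = ref[k] - y
            integ += e * dt
            u = kp * e + ki * integ          # PI control effort
        else:
            u = ref[k]                       # open loop: feed the command straight in
        y += dt * (K * u - y) / tau          # first-order plant dy/dt = (K*u - y)/tau
        out[k] = y
    return out

for label, y in (("open loop", simulate(False)), ("closed loop", simulate(True))):
    settled = t[np.argmax(y >= 0.95)]        # first time the response reaches 95 %
    print(f"{label}: reaches 95% of the step at t = {settled*1000:.0f} ms")
```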
Abstract:
One major gap in transportation system safety management is the inability to assess the safety ramifications of design changes for both new road projects and modifications to existing roads. To fulfill this need, FHWA and its many partners are developing a safety forecasting tool, the Interactive Highway Safety Design Model (IHSDM). The tool will be used by roadway design engineers, safety analysts, and planners throughout the United States. As such, the statistical models embedded in IHSDM will need to be able to forecast safety impacts under a wide range of roadway configurations and environmental conditions for a wide range of driver populations and will need to be able to capture elements of driving risk across states. One of the IHSDM algorithms developed by FHWA and its contractors is for forecasting accidents on rural road segments and rural intersections. The methodological approach is to use predictive models for specific base conditions, with traffic volume information as the sole explanatory variable for crashes, and then to apply regional or state calibration factors and accident modification factors (AMFs) to estimate the impact on accidents of geometric characteristics that differ from the base model conditions. In the majority of past approaches, AMFs are derived from parameter estimates associated with the explanatory variables. A recent study for FHWA used a multistate database to examine in detail the use of the algorithm with the base model-AMF approach and explored alternative base model forms as well as the use of full models that included nontraffic-related variables and other approaches to estimate AMFs. That research effort is reported here. The results support the IHSDM methodology.
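The base-model-plus-AMF structure described above amounts to a simple product: predicted crash frequency = base model (a function of traffic volume under base conditions) x a regional or state calibration factor x the product of AMFs for geometric features that differ from the base conditions. The sketch below uses that structure with purely illustrative coefficients and AMF values; none of them come from the FHWA study or the IHSDM algorithm itself.

```python
# Illustrative sketch of the base-model * calibration * AMF structure used in
# IHSDM-style crash prediction. All coefficients and AMF values are hypothetical.
import math

def base_segment_crashes(aadt: float, length_mi: float) -> float:
    """Expected crashes/year for a rural segment under base conditions
    (traffic volume and length as the only inputs; illustrative coefficients)."""
    return aadt * length_mi * 365 * 1e-6 * math.exp(-0.4)

def predicted_crashes(aadt: float, length_mi: float,
                      calibration: float, amfs: dict) -> float:
    n = base_segment_crashes(aadt, length_mi) * calibration
    for name, amf in amfs.items():
        n *= amf            # each AMF scales the estimate for one non-base feature
    return n

# Hypothetical 2-mile segment carrying 6000 vehicles/day in a state with
# calibration factor 1.1, narrow lanes and no shoulders (AMF values invented).
amfs = {"lane_width_10ft": 1.15, "shoulder_none": 1.25}
print(f"{predicted_crashes(6000, 2.0, 1.1, amfs):.2f} expected crashes per year")
```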
Abstract:
Component software has many benefits, most notably increased software re-use; however, the component software process places heavy burdens on programming language technology, which modern object-oriented programming languages do not address. In particular, software components require specifications that are both sufficiently expressive and sufficiently abstract, and, where possible, these specifications should be checked formally by the programming language. This dissertation presents a programming language called Mentok that provides two novel programming language features enabling improved specification of stateful component roles. Negotiable interfaces are interface types extended with protocols, and allow specification of changing method availability, including some patterns of out-calls and re-entrance. Type layers are extensions to module signatures that allow specification of abstract control flow constraints through the interfaces of a component-based application. Development of Mentok's unique language features included creation of MentokC, the Mentok compiler, and formalization of key properties of Mentok in mini-languages called MentokP and MentokL.
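Mentok's own syntax is not reproduced in the abstract, but the idea of a negotiable interface (an interface whose protocol determines which methods are currently available) can be approximated dynamically. The Python sketch below enforces a simple open -> read* -> close protocol at run time; the class and method names are invented for illustration, and Mentok's point is that such protocols are specified in interface types and checked statically by the compiler rather than at run time.

```python
# Dynamic approximation (illustrative only) of a protocol-constrained component role:
# a reader whose methods are only "available" in certain protocol states.
# Mentok's negotiable interfaces express this statically in the interface type.

class ProtocolError(Exception):
    pass

class ReaderRole:
    """Protocol: closed --open()--> open --read()*--> open --close()--> closed"""
    def __init__(self) -> None:
        self.state = "closed"

    def open(self) -> None:
        if self.state != "closed":
            raise ProtocolError("open() is unavailable in state " + self.state)
        self.state = "open"

    def read(self) -> str:
        if self.state != "open":
            raise ProtocolError("read() is unavailable in state " + self.state)
        return "data"

    def close(self) -> None:
        if self.state != "open":
            raise ProtocolError("close() is unavailable in state " + self.state)
        self.state = "closed"

r = ReaderRole()
r.open(); print(r.read()); r.close()
try:
    r.read()                      # violates the protocol: read() after close()
except ProtocolError as e:
    print("rejected:", e)
```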