870 results for Patentability Requirements
Abstract:
Doctoral thesis in Information Technologies and Systems
Abstract:
The use of construction and demolition waste (C&DW) in the construction industry is an important contribution to attaining sustainability in the sector. Roads are among the civil engineering works that can use the largest quantities of C&DW recycled aggregates. In Portugal, the limit values for the properties of C&DW recycled aggregates that can be used in the roads of the Portuguese Road Network are defined by two Laboratório Nacional de Engenharia Civil (LNEC) technical specifications (TS), in accordance with Portuguese Decree-Law no. 46/2008 of May 12th. Municipal and rural roads and trenches have specific characteristics that can enable the use of C&DW of lower quality than that required by the existing LNEC TS, while still ensuring adequate performance. However, given the absence of specific regulation for those applications, the Portuguese Environment Agency requires compliance with the existing LNEC TS, which represents an obstacle to recycling a significant part of the C&DW, particularly at the local government level. This paper presents guidelines for the recycling of C&DW in municipal and rural roads and in trenches, which could be considered in a forthcoming LNEC TS. The guidelines take into consideration the bibliography collected and analysed, the information gathered from the application of C&DW in municipal and rural roads of a Portuguese municipality and in the roadways of a Portuguese resort, and the results of laboratory tests carried out on samples collected in that municipality.
Abstract:
The paper reflects the work of COST Action TU1403 Workgroup 3/Task group 1. The aim is to identify research needs from a review of the state of the art of three aspects related to adaptive façade systems: (1) dynamic performance requirements; (2) façade design under stochastic boundary conditions and (3) experiences with adaptive façade systems and market needs.
Abstract:
Presentation given on 23 October 2015 at the National Workshop for Open Access: «Open Access to research publications & data», Nicosia, Cyprus. Also available at: http://hdl.handle.net/10797/14460
Abstract:
Although the ASP model has been around for over a decade, it has not achieved the expected level of market uptake. This research project examines the past and present state of ASP adoption and identifies security as a primary factor influencing the uptake of the model. The early chapters of this document examine the ASP model and ASP security in particular. Specifically, the literature and technology review chapter analyses ASP literature, security technologies and best practices with respect to system security in general. Based on this investigation, a prototype illustrating the range and types of technologies that make up a security framework was developed and is described in detail. The latter chapters of this document evaluate the practical implementation of system security in an ASP environment. Finally, this document outlines the research outputs, including the conclusions drawn and recommendations with respect to system security in an ASP environment. The primary research output is the recommendation that, by following security best practices, an ASP application can provide the same level of security one would expect from any other n-tier client-server application. In addition, the author developed a security evaluation matrix, which could be used to evaluate not only the security of ASP applications but the security of any n-tier application. This thesis shows that fears of inadequate security of ASP solutions and solution data are misguided. Finally, based on the research conducted, the author recommends that ASP solutions should be developed and deployed on tried, tested and trusted infrastructure. Existing Application Programming Interfaces (APIs) should be used where possible, and security best practices should be adhered to where feasible.
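The abstract does not describe the internal structure of the security evaluation matrix, so the following is only a minimal sketch of one plausible form: criteria scored per tier of an n-tier application and combined into weighted totals. The class name, criteria, tiers, weights and scores are all hypothetical.

```python
# Illustrative sketch only: a generic security evaluation matrix for an
# n-tier application. Criteria, tiers, weights and scores are hypothetical;
# the thesis' actual matrix is not described in the abstract.
from dataclasses import dataclass, field

@dataclass
class SecurityEvaluationMatrix:
    tiers: list          # e.g. ["presentation", "application", "data"]
    criteria: dict       # criterion name -> weight (relative importance)
    scores: dict = field(default_factory=dict)  # (criterion, tier) -> 0..5

    def rate(self, criterion: str, tier: str, score: int) -> None:
        """Record how well a tier satisfies a criterion (0 = absent, 5 = strong)."""
        self.scores[(criterion, tier)] = score

    def tier_score(self, tier: str) -> float:
        """Weighted average score for one tier."""
        total_weight = sum(self.criteria.values())
        weighted = sum(w * self.scores.get((c, tier), 0)
                       for c, w in self.criteria.items())
        return weighted / total_weight

    def overall(self) -> float:
        """Mean of the per-tier scores: a single figure for the whole solution."""
        return sum(self.tier_score(t) for t in self.tiers) / len(self.tiers)

# Usage: score a hypothetical three-tier ASP solution.
matrix = SecurityEvaluationMatrix(
    tiers=["presentation", "application", "data"],
    criteria={"authentication": 3, "transport encryption": 2, "auditing": 1},
)
matrix.rate("authentication", "presentation", 4)
matrix.rate("transport encryption", "presentation", 5)
matrix.rate("auditing", "data", 3)
print(round(matrix.overall(), 2))
```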
Abstract:
v. 1
Abstract:
Chrysomya albiceps specimens were obtained from colonies established with larvae and adults collected at the Federal Rural University in Rio de Janeiro, Seropédica, State of Rio de Janeiro. The larval stage of C. albiceps was allowed to develop in climatic chambers at temperatures of 18, 22, 27 and 32ºC, and the pupal stage was allowed to develop at 22, 27 and 32ºC (60 ± 10% RH and 14 h photoperiod). The duration and viability of the larval stage at 18, 22, 27 and 32ºC were 21.30, 10.61, 5.0 and 4.0 days and 76.5, 88.5, 98.5 and 99.5%, respectively, with mean mature larval weights of 45.16, 81.86, 84.35 and 70.53 mg, respectively. Mean duration and viability of the pupal stage at 22, 27 and 32ºC were 9.36, 4.7 and 3.0 days and 93.8, 100 and 100%, respectively. The basal temperatures for the larval stage, the pupal stage and the larva-to-adult period were 15.04, 17.39 and 15.38ºC, corresponding to thermal constants of 65.67, 44.15 and 114.23 degree-days (DD), respectively.
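The basal temperature and thermal constant quoted above follow the standard linear degree-day model, in which development rate (1/duration) is regressed on temperature: the base temperature is the x-intercept of the regression line and the thermal constant is the reciprocal of its slope. A minimal sketch using the larval durations from the abstract (the paper's exact regression procedure is assumed, not stated here):

```python
# Sketch of the standard linear degree-day model: regress development rate
# (1/duration) on temperature; base temperature Tb = -intercept/slope and
# thermal constant K = 1/slope (in degree-days). Data are the larval
# durations quoted in the abstract.
import numpy as np

temps = np.array([18.0, 22.0, 27.0, 32.0])       # rearing temperatures (°C)
durations = np.array([21.30, 10.61, 5.0, 4.0])   # larval duration (days)

rates = 1.0 / durations                          # development rate (1/day)
slope, intercept = np.polyfit(temps, rates, 1)   # rate ≈ slope*T + intercept

Tb = -intercept / slope                          # basal (threshold) temperature
K = 1.0 / slope                                  # thermal constant (degree-days)

print(f"Tb ≈ {Tb:.2f} °C, K ≈ {K:.2f} DD")       # ≈ 15.0 °C and ≈ 66 DD,
                                                 # close to the reported 15.04 / 65.67
```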
Abstract:
The report presents a grammar capable of analyzing the process of production of electricity in modular elements for different power-supply systems, defined using semantic and formal categories. In this way it becomes possible to individuate similarities and differences in the process of production of electricity, and then measure and compare “apples” with “apples” and “oranges” with “oranges”. For instance, when comparing the various unit operations of the process of production of electricity with nuclear energy to the analogous unit operations of the process of production of fossil energy, we see that the various phases of the process are the same. The only difference is related to characteristics of the process associated with the generation of heat, which are completely different in the two systems. As a matter of fact, the performance of the production of electricity from nuclear energy can be studied by comparing the biophysical costs associated with the different unit operations taking place in nuclear and fossil power plants when generating process heat or net electricity. By adopting this approach, it becomes possible to compare the performance of the two power-supply systems by comparing their relative biophysical requirements for the phases that both nuclear energy power plants and fossil energy power plants have in common: (i) mining; (ii) refining/enriching; (iii) generating heat/electricity; (iv) handling the pollution/radioactive wastes. This report presents the evaluation of the biophysical requirements for the two power-supply systems: nuclear energy and fossil energy. In particular, the report focuses on the following requirements: (i) electricity; (ii) fossil fuels; (iii) labor; and (iv) materials.
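As an illustration of this accounting scheme, the sketch below records the four requirements (electricity, fossil fuels, labor, materials) for each of the four shared unit operations and aggregates them per system so the two power-supply systems can be compared like-with-like. All numeric values are placeholders, not figures from the report.

```python
# Illustrative sketch of the phase-by-phase biophysical accounting described
# above. Requirements are recorded per unit operation and per flow, then
# aggregated for comparison. All numbers are placeholders, NOT report data.
PHASES = ["mining", "refining/enriching", "generating heat/electricity",
          "handling pollution/radioactive wastes"]
FLOWS = ["electricity", "fossil fuels", "labor", "materials"]

# requirements[system][phase][flow] = amount per unit of net electricity (placeholder units)
requirements = {
    "nuclear": {
        "mining": {"electricity": 1.0, "fossil fuels": 2.0, "labor": 0.3, "materials": 5.0},
        "refining/enriching": {"electricity": 4.0, "fossil fuels": 1.0, "labor": 0.2, "materials": 1.0},
        "generating heat/electricity": {"electricity": 0.5, "fossil fuels": 0.2, "labor": 0.4, "materials": 2.0},
        "handling pollution/radioactive wastes": {"electricity": 0.8, "fossil fuels": 0.3, "labor": 0.5, "materials": 3.0},
    },
    "fossil": {
        "mining": {"electricity": 0.6, "fossil fuels": 3.0, "labor": 0.4, "materials": 2.0},
        "refining/enriching": {"electricity": 0.9, "fossil fuels": 1.5, "labor": 0.2, "materials": 0.5},
        "generating heat/electricity": {"electricity": 0.4, "fossil fuels": 9.0, "labor": 0.3, "materials": 1.0},
        "handling pollution/radioactive wastes": {"electricity": 0.3, "fossil fuels": 0.2, "labor": 0.2, "materials": 0.8},
    },
}

def total_by_flow(system: str) -> dict:
    """Sum each biophysical flow over all shared unit operations."""
    return {flow: sum(requirements[system][phase][flow] for phase in PHASES)
            for flow in FLOWS}

for system in requirements:
    print(system, total_by_flow(system))
```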
Abstract:
Gim & Kim (1998) proposed a generalization of Jeong's (1982, 1984) reinterpretation of the Hawkins-Simon condition for macroeconomic stability to off-diagonal matrix elements. This generalization is conceptually relevant, for it offers a complementary view of interindustry linkages beyond final or net output influence. The extension is essentially the same as the 'total flow' idea introduced by Szyrmer (1992) and the 'output-to-output' multiplier of Miller & Blair (2009). However, the practical implementation by Gim & Kim is faulty, since it confuses the appropriate order of output normalization. We provide a new and elementary solution for the correct formalization using standard interindustry accounting concepts.
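For reference, the standard input-output quantities involved can be written as follows. The total-flow (output-to-output) multiplier shown is the conventional formulation associated with Szyrmer (1992) and Miller & Blair (2009), in which each column of the Leontief inverse is normalized by its own diagonal element, i.e. per unit of gross output of the target sector rather than per unit of its final demand; the abstract does not spell out the authors' corrected formalization, so this is offered only as the conventional reference point.

```latex
% Standard Leontief system: technical coefficients A, final demand f, gross output x.
\begin{equation}
  x = A x + f, \qquad x = (I - A)^{-1} f = L f, \qquad L = [\,l_{ij}\,].
\end{equation}
% Conventional output multiplier of sector j (output generated per unit of final demand f_j):
\begin{equation}
  O_j = \sum_i l_{ij}.
\end{equation}
% Total-flow / output-to-output multiplier (Szyrmer 1992; Miller & Blair 2009):
% normalize column j of L by its diagonal element, so the multiplier is expressed
% per unit of gross output x_j rather than per unit of final demand f_j.
\begin{equation}
  \bar{l}_{ij} = \frac{l_{ij}}{l_{jj}}, \qquad
  \bar{O}_j = \frac{\sum_i l_{ij}}{l_{jj}}.
\end{equation}
```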
Abstract:
OBJECTIVE: To assess the change in non-compliant items in prescription orders following the implementation of a computerized physician order entry (CPOE) system named PreDiMed. SETTING: The departments of internal medicine (39 and 38 beds) in two regional hospitals in Canton Vaud, Switzerland. METHOD: The prescription lines in 100 pre- and 100 post-implementation patients' files were classified according to three modes of administration (medicines for oral or other non-parenteral uses; medicines administered parenterally or via nasogastric tube; pro re nata (PRN), i.e. as needed) and analyzed for a number of relevant variables constitutive of medical prescriptions. MAIN OUTCOME MEASURE: The monitored variables depended on the pharmaceutical category and included mainly the name of the medicine, pharmaceutical form, posology and route of administration, diluting solution, flow rate and identification of the prescriber. RESULTS: In 2,099 prescription lines, the total number of non-compliant items was 2,265 before CPOE implementation, or 1.079 non-compliant items per line. Two-thirds of these were due to missing information, and the remaining third to incomplete information. In 2,074 prescription lines post-CPOE implementation, the number of non-compliant items had decreased to 221, or 0.107 non-compliant items per line, a dramatic 10-fold decrease (χ² = 4615; P < 10⁻⁶). Limitations of the computerized system were the risk of erroneous items in some non-prefilled fields and ambiguity due to a field with doses shown on commercial products. CONCLUSION: The deployment of PreDiMed in two departments of internal medicine has led to a major improvement in formal aspects of physicians' prescriptions. Some limitations of the first version of PreDiMed were unveiled and are being corrected.
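The per-line rates and the fold change quoted in the results follow directly from the reported counts; a quick arithmetic check is shown below (the χ² statistic itself depends on the paper's contingency table and is not recomputed here).

```python
# Quick check of the per-line non-compliance rates and the fold decrease
# reported above; the chi-squared statistic is not recomputed because the
# underlying contingency table is not given in the abstract.
pre_items, pre_lines = 2265, 2099
post_items, post_lines = 221, 2074

pre_rate = pre_items / pre_lines      # ≈ 1.079 non-compliant items per line
post_rate = post_items / post_lines   # ≈ 0.107 non-compliant items per line
fold_decrease = pre_rate / post_rate  # ≈ 10-fold

print(f"pre: {pre_rate:.3f}, post: {post_rate:.3f}, fold: {fold_decrease:.1f}")
```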
Abstract:
We propose a charging scheme for cost distribution along a multicast tree when the cost is the responsibility of the receivers. The scheme focuses on QoS considerations and does not depend on any specific type of service. It has been designed to be used as a bridge between unicast and multicast services, solving the problem of charging multicast services by means of unicast charging and existing QoS routing mechanisms. We also include a numerical comparison, a discussion of the case of non-numerical or relative QoS, and a discussion of the application to some service examples, in order to give a better understanding of the proposal.
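The abstract does not give the charging formula itself. The sketch below therefore shows only a common baseline for receiver-based cost distribution on a multicast tree, in which each link's cost is split equally among the receivers downstream of it; it is an illustration of the general problem setting, not the scheme proposed in the paper, and the tree and costs are hypothetical.

```python
# Baseline illustration (NOT the paper's scheme): split each link's cost of a
# multicast tree equally among the receivers reachable through that link, and
# charge every receiver the sum of its shares over the links on its path.
from collections import defaultdict

# Hypothetical tree: child -> (parent, cost of the link parent->child)
tree = {
    "r1": ("b", 1.0), "r2": ("b", 1.0), "r3": ("a", 4.0),
    "b": ("a", 2.0), "a": ("source", 3.0),
}
receivers = ["r1", "r2", "r3"]

def path_to_source(node):
    """Links (identified by their child node) traversed from node up to the source."""
    while node in tree:
        yield node
        node = tree[node][0]

# Count how many receivers share each link.
sharers = defaultdict(int)
for r in receivers:
    for link in path_to_source(r):
        sharers[link] += 1

# Each receiver pays its share of every link on its path.
charge = {r: sum(tree[link][1] / sharers[link] for link in path_to_source(r))
          for r in receivers}
print(charge)  # r1: 3.0, r2: 3.0, r3: 5.0 (sums to the total tree cost of 11.0)
```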
Abstract:
In this paper, we consider ATM networks in which the virtual path (VP) concept is implemented. The question of how to multiplex two or more diverse traffic classes while providing different quality of service (QOS) requirements is a very complicated open problem. Two distinct options are available: integration and segregation. In an integration approach, all the traffic from different connections is multiplexed onto one VP. This implies that the most restrictive QOS requirements must be applied to all services. Therefore, link utilization will be decreased because unnecessarily stringent QOS is provided to all connections. With the segregation approach, the problem can be much simplified if different types of traffic are separated by assigning to each a VP with dedicated resources (buffers and links). Therefore, resources may not be efficiently utilized because no sharing of bandwidth can take place across the VPs. The probability that the bandwidth required by the accepted connections exceeds the capacity of the link is evaluated as the probability of congestion (PC). Since the PC can be expressed as the cell loss probability (CLP), we shall simply carry out bandwidth allocation using the PC. We first focus on the influence of some parameters (CLP, bit rate and burstiness) on the capacity required by a VP supporting a single traffic class, using the new convolution approach. Numerical results are presented both to compare the required capacity and to observe under which conditions each approach is preferred.
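The convolution approach mentioned above can be illustrated as follows: the stationary bandwidth demand of each accepted connection is modelled as a discrete distribution (here, an on/off source at its peak rate), the aggregate demand distribution is obtained by convolving the per-connection distributions, and the PC is the probability mass above the link capacity. This is a generic sketch of the classical convolution method with made-up source parameters, not the specific refinement proposed in the paper.

```python
# Generic convolution approach for the probability of congestion (PC) on a VP:
# each on/off connection demands its peak rate with probability p (burst
# activity) and 0 otherwise; the aggregate demand distribution is the
# convolution of the per-connection distributions; PC = P(demand > capacity).
# Source parameters below are illustrative only.
import numpy as np

def source_distribution(peak_rate_units, activity):
    """Demand distribution (index = bandwidth units) for one on/off source."""
    dist = np.zeros(peak_rate_units + 1)
    dist[0] = 1.0 - activity          # silent
    dist[peak_rate_units] = activity  # bursting at peak rate
    return dist

def probability_of_congestion(sources, link_capacity):
    """Convolve per-source demand distributions and sum the mass above capacity."""
    aggregate = np.array([1.0])       # 'zero demand' with probability 1
    for peak, activity in sources:
        aggregate = np.convolve(aggregate, source_distribution(peak, activity))
    return aggregate[link_capacity + 1:].sum()

# Example: 20 bursty sources (peak 10 units, active 30% of the time) plus
# 10 smoother sources (peak 5 units, active 60% of the time) on one VP.
sources = [(10, 0.3)] * 20 + [(5, 0.6)] * 10
for capacity in (60, 80, 100):
    print(capacity, probability_of_congestion(sources, capacity))
```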
Abstract:
Primary care physicians have to assess the visual functions essential for driving when determining medical fitness to drive. However, it can be difficult to apply the legal requirements described in annex 1 of the ordinance regulating admission to road traffic of 1976 (OAC), because they are not unambiguous. This article discusses the visual functions that have to be assessed, namely visual acuity, the visual field and the detection of diplopia, and presents the appropriate methods for the primary care setting. Another objective is to discuss the relevance of road-safety requirements on vision and to present the new Swiss requirements proposed for the future in comparison to some international recommendations.