277 results for Standardisation
Abstract:
Respiration rates of 16 calanoid copepod species from the northern Benguela upwelling system were measured on board RRS Discovery in September/October 2010 to determine their energy requirements and assess their significance in the carbon cycle. Copepod species were sampled by different net types. Immediately after the hauls, samples were sorted to species and stages (16 species; females, males and C5 copepodids) according to Bradford-Grieve et al. (1999). Specimens were kept in temperature-controlled refrigerators for at least 12 h before they were used in experiments. Respiration rates of different copepod species were measured onboard by optode respirometry (for details see Köster et al., 2008) with a 10-channel optode respirometer (PreSens Precision Sensing Oxy-10 Mini, Regensburg, Germany) under simulated in situ conditions in temperature-controlled refrigerators. Experiments were run in gas-tight glass bottles (12-13 ml). For each set of experiments, two controls without animals were measured under exactly the same conditions to compensate for potential bias. The number of animals per bottle depended on the copepods' size, stage and metabolic activity. Animals were not fed during the experiments, but they showed natural species-specific movements. Immediately after the experiments, all specimens were deep-frozen at -80 °C for later dry mass determination (after lyophilisation for 48 h) in the home lab. The carbon content (% of dry mass) of each species was measured by mass spectrometry in association with stable isotope analysis, and body dry mass was converted to units of carbon. For species without available carbon data, the mean value of all copepod species (44% of dry mass) was applied. For the estimation of carbon requirements of copepod species, individual oxygen consumption rates were converted to carbon units, assuming that the expiration of 1 ml oxygen mobilises 0.44 mg of organic carbon, using a respiratory quotient (RQ) of 0.82 for a mixed diet consisting of proteins (RQ = 0.8-1.0), lipids (RQ = 0.7) and carbohydrates (RQ = 1.0) (Auel and Werner, 2003). The carbon ingestion rates were calculated using the energy budget and the potential maximum ingestion rate approach. To allow for physiological comparisons of respiration rates of deep- and shallow-living copepod species without the effects of ambient temperature and different individual body mass, individual respiration rates were temperature- (15 °C, Q10 = 2) and size-adjusted. The scaling coefficient of 0.76 (R² = 0.556) was used for the standardisation of body dry mass to 0.3 mg (mean dry mass of all analysed copepods), applying the allometric equation R = (R15°C / M^0.76) × 0.3^0.76, where R is the standardised respiration rate and M is individual dry mass in mg.
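For illustration, the temperature adjustment, size standardisation and oxygen-to-carbon conversion described above can be sketched as follows (a minimal sketch with made-up example values; the function names and the example copepod are hypothetical, not taken from the study):

```python
def q10_adjust(r_insitu: float, t_insitu_c: float, q10: float = 2.0, t_ref_c: float = 15.0) -> float:
    """Adjust a respiration rate measured at the in situ temperature to the
    reference temperature of 15 degC using a Q10 of 2."""
    return r_insitu * q10 ** ((t_ref_c - t_insitu_c) / 10.0)

def standardise_respiration(r_15c: float, dry_mass_mg: float,
                            scaling_coeff: float = 0.76,
                            reference_mass_mg: float = 0.3) -> float:
    """Size-adjust the temperature-corrected rate to the reference body mass of
    0.3 mg dry mass: R_std = (R_15C / M**b) * 0.3**b, with b = 0.76."""
    return (r_15c / dry_mass_mg ** scaling_coeff) * reference_mass_mg ** scaling_coeff

def oxygen_to_carbon(o2_ul_per_ind_h: float) -> float:
    """Convert oxygen consumption (microlitres O2 per individual per hour) to carbon
    units, assuming 1 ml O2 mobilises 0.44 mg organic carbon (RQ = 0.82)."""
    return o2_ul_per_ind_h * 0.44  # micrograms C per hour (same 0.44 ratio)

# Hypothetical example: 0.8 microlitres O2 per hour measured at 10 degC, 0.5 mg dry mass.
r_15c = q10_adjust(0.8, t_insitu_c=10.0)
r_standardised = standardise_respiration(r_15c, dry_mass_mg=0.5)
carbon_respired = oxygen_to_carbon(0.8)
print(round(r_15c, 3), round(r_standardised, 3), round(carbon_respired, 3))
```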
Abstract:
Motivated by the growing interest in applications of unmanned aerial systems in indoor and outdoor settings and by the standardisation of visual sensors as vehicle payload, this work presents a collision avoidance approach based on omnidirectional cameras that does not require the estimation of range between two platforms to resolve a collision encounter. It achieves a minimum separation between the two vehicles involved by maximising the view angle given by the omnidirectional sensor. Only visual information is used to achieve avoidance under a bearing-only visual servoing approach. We provide the theoretical problem formulation, as well as results from real flights using small quadrotors.
Abstract:
OntoTag - A Linguistic and Ontological Annotation Model Suitable for the Semantic Web
1. INTRODUCTION. LINGUISTIC TOOLS AND ANNOTATIONS: THEIR LIGHTS AND SHADOWS
Computational Linguistics is already a consolidated research area. It builds upon the results of two other major ones, namely Linguistics and Computer Science and Engineering, and it aims at developing computational models of human language (or natural language, as it is termed in this area). Possibly, its best-known applications are the different tools developed so far for processing human language, such as machine translation systems and speech recognisers or dictation programs.
These tools for processing human language are commonly referred to as linguistic tools. Apart from the examples mentioned above, there are also other types of linguistic tools that perhaps are not so well-known, but on which most of the other applications of Computational Linguistics are built. These other types of linguistic tools comprise POS taggers, natural language parsers and semantic taggers, amongst others. All of them can be termed linguistic annotation tools.
Linguistic annotation tools are important assets. In fact, POS and semantic taggers (and, to a lesser extent, also natural language parsers) have become critical resources for the computer applications that process natural language. Hence, any computer application that has to analyse a text automatically and ‘intelligently’ will include at least a module for POS tagging. The more an application needs to ‘understand’ the meaning of the text it processes, the more linguistic tools and/or modules it will incorporate and integrate.
However, linguistic annotation tools still have some limitations, which can be summarised as follows:
1. Normally, they perform annotations only at a certain linguistic level (that is, Morphology, Syntax, Semantics, etc.).
2. They usually introduce a certain rate of errors and ambiguities when tagging. This error rate ranges from 10 percent up to 50 percent of the units annotated for unrestricted, general texts.
3. Their annotations are most frequently formulated in terms of an annotation schema designed and implemented ad hoc.
A priori, it seems that the interoperation and the integration of several linguistic tools into an appropriate software architecture could most likely solve the limitations stated in (1). Besides, integrating several linguistic annotation tools and making them interoperate could also minimise the limitation stated in (2). Nevertheless, in the latter case, all these tools should produce annotations for a common level, which would have to be combined in order to correct their corresponding errors and inaccuracies. Yet, the limitation stated in (3) prevents both types of integration and interoperation from being easily achieved.
In addition, most high-level annotation tools rely on other lower-level annotation tools and their outputs to generate their own annotations. For example, sense-tagging tools (operating at the semantic level) often use POS taggers (operating at a lower level, i.e., the morphosyntactic one) to identify the grammatical category of the word or lexical unit they are annotating. Accordingly, if a faulty or inaccurate low-level annotation tool is to be used by another, higher-level one in its process, the errors and inaccuracies of the former should be minimised in advance. Otherwise, these errors and inaccuracies would be transferred to (and even magnified in) the annotations of the high-level annotation tool.
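As a purely illustrative toy example (not any actual annotation tool; all names, tags and lexicon entries below are hypothetical), the following sketch shows how an error made by a lower-level POS tagger is inherited by a higher-level sense tagger built on top of it:

```python
# Toy sketch of the dependency described above: a deliberately naive POS tagger feeds a
# sense tagger, so the upstream tagging error propagates into the semantic annotation.

TOY_SENSE_LEXICON = {
    ("book", "VERB"): "book%make_a_reservation",
    ("book", "NOUN"): "book%printed_work",
    ("flights", "NOUN"): "flight%air_travel",
}

def toy_pos_tagger(tokens):
    # Faulty low-level tool: it tags every token as NOUN, so the verb "book" is mis-tagged.
    return [(token, "NOUN") for token in tokens]

def toy_sense_tagger(tagged_tokens):
    # Higher-level tool: it trusts the POS tag when choosing a sense, inheriting the error.
    return [(token, pos, TOY_SENSE_LEXICON.get((token, pos), "unknown"))
            for token, pos in tagged_tokens]

print(toy_sense_tagger(toy_pos_tagger(["book", "flights"])))
# -> [('book', 'NOUN', 'book%printed_work'), ('flights', 'NOUN', 'flight%air_travel')]
# The reservation sense of "book" is unreachable because the lower-level tool got the POS wrong.
```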
Therefore, it would be quite useful to find a way to
(i) correct or, at least, reduce the errors and the inaccuracies of lower-level linguistic tools;
(ii) unify the annotation schemas of different linguistic annotation tools or, more generally speaking, make these tools (as well as their annotations) interoperate.
Clearly, solving (i) and (ii) should ease the automatic annotation of web pages by means of linguistic tools, and their transformation into Semantic Web pages (Berners-Lee, Hendler and Lassila, 2001). Yet, as stated above, (ii) is a type of interoperability problem. There again, ontologies (Gruber, 1993; Borst, 1997) have been successfully applied thus far to solve several interoperability problems. Hence, ontologies should also help solve the aforementioned problems and limitations of linguistic annotation tools.
Thus, to summarise, the main aim of the present work was to combine these separate approaches, mechanisms and tools for annotation from Linguistics and Ontological Engineering (and the Semantic Web) into a sort of hybrid (linguistic and ontological) annotation model, suitable for both areas. This hybrid (semantic) annotation model should (a) benefit from the advances, models, techniques, mechanisms and tools of these two areas; (b) minimise (and even solve, when possible) some of the problems found in each of them; and (c) be suitable for the Semantic Web. The concrete goals that helped attain this aim are presented in the following section.
2. GOALS OF THE PRESENT WORK
As mentioned above, the main goal of this work was to specify a hybrid (that is, linguistically-motivated and ontology-based) model of annotation suitable for the Semantic Web (i.e. it had to produce a semantic annotation of web page contents). This entailed that the tags included in the annotations of the model had to (1) represent linguistic concepts (or linguistic categories, as they are termed in ISO/DCR (2008)), in order for this model to be linguistically-motivated; (2) be ontological terms (i.e., use an ontological vocabulary), in order for the model to be ontology-based; and (3) be structured (linked) as a collection of ontology-based
Abstract:
When fresh fruit reaches the final markets from the suppliers, its quality is not always as good as it should be, either because it has been mishandled during transportation or because it lacked adequate quality control at the producer level before being shipped. This is why the final markets need to establish their own quality assessment system if they want to guarantee their customers the quality they intend to sell. In this work, a system to control fruit quality at the last level of the distribution channel has been designed. The system combines rapid control techniques with laboratory equipment and statistical sampling protocols to obtain a dynamic, objective process which can advantageously replace the quality control inspections carried out visually by human experts at the reception platform of most hypermarkets. Portable measuring equipment has been chosen (firmness tester, temperature and humidity sensors...), as well as easy-to-use laboratory equipment (texturometer, colorimeter, refractometer...), combining them to control the most important fruit quality parameters (firmness, colour, sugars, acids). A complete computer network has been designed to control all the processes, store the collected data in real time and perform the computations. The sampling methods have also been defined to guarantee the confidence of the results. Some of the advantages of a quality assessment system such as the proposed one are: the minimisation of human subjectivity, the ability to use modern measuring techniques, and the possibility of using it also as a supplier's quality control system. It can also be a way to clarify the quality limits of fruits among members of the commercial channel, as well as a first step in the standardisation of quality control procedures.
Abstract:
The development of Civil Engineering in the 21st century should simultaneously address the functional needs of the project and the conservation and sustainability of the territory. For this process to be carried out efficiently, public and private promoters, the administration, and citizens and their associations must be integrated into the mechanism for managing and documenting the project. The guidelines of the European Union and the current legislative framework (Law 21/2013) are oriented in this direction, but the tools in use at the moment do not adequately meet these needs. Standard UNE 157921:2006 and its successors should be revised to reflect both the new legislative framework and, above all, the new technological reality for managing technical and scientific documentation through extensible languages, database integration, tools for social participation, and tools for the protection and conservation of the territory, all along the project life cycle. This thesis presents ongoing work on the analysis and proposal of methodologies for the standardisation of environmental assessment reports that allow for their management, documentation and social participation.
Abstract:
Michelle Egan and Jacques Pelkmans provide an overview of the TBT chapter in TTIP and the various issues between the US and the EU in this area, which in turn requires extensive expositions of domestic regulation in the US and the EU. TBTs, outside heavily regulated sectors such as chemicals, automobiles or medicines (which have separate chapters in TTIP), can be caused by divergent (voluntary) standards, technical regulations and conformity assessment. Indeed, in all three areas the US and the EU have long experienced frictions, with considerable trading costs. The 1998 Mutual Recognition Agreement on conformity assessment only succeeded in two out of six sectors. The US and European standardisation traditions differ, and this paper explains why it is so hard, also economically, to realise convergence. However, the authors reject the unproductive ‘stand-off’ between US and EU negotiators on standardisation and suggest clarifying the enormous economic ‘installed base’ of prominent US standards in the world economy and building a solution from there. As to technical regulation, the prospect of converging regulation (via harmonisation) is often dim, but equivalence (given similar levels of regulatory protection) can be an option.
Abstract:
This paper reviews peer-to-peer (P2P) lending, its development in the UK and other countries, and assesses the business and economic policy issues surrounding this new form of intermediation. P2P platform technology allows direct matching of borrowers and lenders, with diversification over a large number of borrowers, without the loans having to be held on an intermediary balance sheet. P2P lending has developed rapidly in both the US and the UK, but it still represents a small fraction, less than 1%, of the stock of bank lending. In the UK – but not elsewhere – it is an important source of loans for smaller companies. We argue that P2P lending is fundamentally complementary to, and not competitive with, conventional banking. We therefore expect banks to adapt to the emergence of P2P lending, either by cooperating closely with third-party P2P lending platforms or by offering their own proprietary platforms. We also argue that the full development of the sector requires much further work addressing the risks and business and regulatory issues in P2P lending, including risk communication, orderly resolution of platform failure, control of liquidity risks and minimisation of fraud, security and operational risks. This will depend on developing reliable business processes, promoting transparency and standardisation to the fullest extent possible, and ensuring appropriate regulation that serves the needs of customers.
Abstract:
Despite the standardisation of surgical techniques and significant progress in chemotherapeutics over the last 30 years, advanced epithelial ovarian cancer remains the most lethal gynaecological malignancy in the western world. Although the majority of women achieve a remission following primary therapy, most patients with advanced stage disease will eventually relapse and become candidates for 'salvage' therapy. The chances of a further remission depend on factors such as the 'treatment-free interval', and there are now a large number of chemotherapy agents with activity in ovarian cancer available to the oncologist. Recent randomised studies have reported on survival benefits for chemotherapy in recurrent disease, and therefore careful and appropriate selection of treatments has assumed a greater importance. This article reviews the most current data, and discusses the factors involved in making individualised treatment decisions.
Abstract:
In the three years to June 2005, 959 injuries associated with continuous miners (CMs), shuttle cars (SCs), load–haul–dump vehicles and personnel transport (PT) were reported by NSW underground coal mines, comprising 23% of all injuries reported. The present paper reports an analysis of the narrative field accompanying these reports to determine opportunities for controlling injury risks. The most common combinations of activity and mechanism were: strain while handling CM cable (96 injuries); caught between or struck by moving parts while bolting on a CM (86 injuries); strains while bolting on a CM (54 injuries); and slipping off a CM during access, egress or other activity (60 injuries). For the other equipment considered, the common injury mechanism was the vehicle running over a pothole or other roadway abnormality, causing the driver or passengers to be injured (169 injuries). Potential control measures include: monorails for CM services; hydraulic cable reelers; handrails on CM platforms; redesign of CM platforms and bolting rigs to reduce reach distances during drilling and bolting; improvements to guarding of bolting controls; standardisation and shape coding of bolting controls; two-handed fast feed; improvements in underground roadway maintenance, vehicle suspension, visibility and seating; and pedestrian proximity warning devices.
Abstract:
This paper reports the initial results of a joint research project carried out by Aston University and Lloyd's Register to develop a practical method of assessing neural network applications. A set of assessment guidelines for neural network applications was developed and tested on two applications. These case studies showed that it is practical to assess neural networks in a statistical pattern recognition framework. However, there is a need for more standardisation in neural network technology and a wider take-up of good development practice among the neural network community.
Abstract:
Purpose - To provide a framework of accounting policy choice associated with the timing of adoption of the UK Statement of Standard Accounting Practice (SSAP) No. 20, "Foreign Currency Translation". The conceptual framework describes the accounting policy choices that firms face in a setting that is influenced by: their financial characteristics; the flexible foreign exchange rates; and the stock market response to accounting decisions. Design/methodology/approach - Following the positive accounting theory context, this paper puts into a framework the motives and choices of UK firms with regard to the adoption or deferment of the adoption of SSAP 20. The paper utilises the theoretical and empirical findings of previous studies to form and substantiate the conceptual framework. Given the UK foreign exchange setting, the framework identifies the initial stage: lack of regulation and flexibility in financial reporting; the intermediate stage: accounting policy choice; and the final stage: accounting choice and policy review. Findings - There are situations where accounting regulation contrasts with the needs and business objectives of firms and vice-versa. Thus, firms may delay the adoption up to the point where the increase in political costs can just be tolerated. Overall, the study infers that firms might have chosen to defer the adoption of SSAP 20 until they reach a certain corporate goal, or the adverse impact (if any) of the accounting change on firms' financial numbers is minimal. Thus, the determination of the timing of the adoption is a matter which is subject to the objectives of the managers in association with the market and economic conditions. The paper suggests that the flexibility in financial reporting, which may enhance the scope for income-smoothing, can be mitigated by the appropriate standardisation of accounting practice. Research limitations/implications - First, the study encompassed a period when firms and investors were less sophisticated users of financial information. Second, it is difficult to ascertain the decisions that firms would have taken, had the pound appreciated over the period of adoption and had the firms incurred translation losses rather than translation gains. Originality/value - This paper is useful to accounting standards setters, professional accountants, academics and investors. The study can give the accounting standard-setting bodies useful information when they prepare a change in the accounting regulation or set an appropriate date for the implementation of an accounting standard. The paper provides significant insight about the behaviour of firms and the associated impacts of financial markets and regulation on the decision-making process of firms. The framework aims to assist the market and other authorities to reduce information asymmetry and to reinforce the efficiency of the market. © Emerald Group Publishing Limited.
Abstract:
Investigates the degree of global standardisation of a corporate visual identity system (CVIS) in multinational operations. A special emphasis of this research is accorded to UK companies operating in Malaysia. In particular, the study seeks to reveal the reasons for developing a standardised CVIS; the behavioural issues associated with CVIS; and the determination in selecting a graphic design agency. The findings of the research revealed that multinational corporations in an increasingly corporate environment adopted a standardised CVIS for several reasons, including, aiding the sale of products and services, creating an attractive environment for hiring employees, and increasing the company’s stature and presence. Further findings show that the interest in global identity was stimulated by global restructuring, merger or acquisition. The above trends help explain why increased focus has been accorded to CVIS over the past five years by many UK companies operating in Malaysia. Additional findings reveal that both the UK design agencies and in-house design department are used in the development of the firms’ CVIS.
Abstract:
The advent of the Integrated Services Digital Network (ISDN) led to the standardisation of the first video codecs for interpersonal video communications, followed closely by the development of standards for the compression, storage and distribution of digital video in the PC environment, mainly targeted at CD-ROM storage. At the same time, the second-generation digital wireless networks, and the third-generation networks being developed, have enough bandwidth to support digital video services. The radio propagation medium is a difficult environment in which to deploy low bit error rate, real-time services such as video. The video coding standards designed for ISDN and storage applications were targeted at low bit error rates, orders of magnitude lower than the typical bit error rates experienced on wireless networks. This thesis is concerned with the transmission of digital, compressed video over wireless networks. It investigates the behaviour of motion-compensated, hybrid interframe DPCM/DCT video coding algorithms, which form the basis of current coding algorithms, in the presence of the high bit error rates commonly found on digital wireless networks. A group of video codecs, based on the ITU-T H.261 standard, are developed which are robust to the burst errors experienced on radio channels. The radio link is simulated at a low level, to generate typical error files that closely model real-world situations, in a Rayleigh fading environment perturbed by co-channel interference, and on frequency-selective channels which introduce intersymbol interference. Typical anti-multipath techniques, such as antenna diversity, are deployed to mitigate the effects of the channel. Link-layer error control techniques are also investigated.
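The thesis's low-level radio-link simulation is not reproduced here; purely as an illustration of the kind of burst-error patterns referred to above, the following sketch uses a two-state Gilbert–Elliott model, a common textbook way to generate clustered bit errors (all parameter values are made up):

```python
import random

def gilbert_elliott_error_pattern(n_bits, p_good_to_bad=0.001, p_bad_to_good=0.1,
                                  ber_good=1e-6, ber_bad=0.1, seed=0):
    """Generate a burst-error pattern with a two-state Gilbert-Elliott model:
    a 'good' state with a very low bit error rate and a 'bad' (deep-fade) state
    with a high bit error rate, so errors arrive in bursts rather than at random."""
    rng = random.Random(seed)
    in_bad_state = False
    pattern = []
    for _ in range(n_bits):
        # Markov state transition between the good and bad channel states.
        if in_bad_state:
            if rng.random() < p_bad_to_good:
                in_bad_state = False
        elif rng.random() < p_good_to_bad:
            in_bad_state = True
        # Decide whether the current bit is corrupted, given the state's error rate.
        ber = ber_bad if in_bad_state else ber_good
        pattern.append(1 if rng.random() < ber else 0)
    return pattern  # 1 marks a corrupted bit; XOR against the coded bitstream to apply it.

errors = gilbert_elliott_error_pattern(100_000)
print(sum(errors), "corrupted bits out of 100000")
```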
Abstract:
This text is concerned with the intellectual and social alienation experienced by a twentieth-century German writer (1906 - ). The alienation begins in the context of German society, but this context is later globalised. The thesis first discusses the social and intellectual origins and the salient features of this alienated stance, before proceeding to a detailed analysis of its recurring symptoms and later intensification in each of the author's main works, chronologically surveyed, supported by reference to minor writings. From the novels of the thirties, showing the burgher-artist conflict and its symbolic dichotomies, the renunciation of traditional German values, and the ambiguous confrontation with new disruptive socio-political forces, we move to the post-war trilogy (1951-54), with its roots in the German social and political experience of the thirties onwards. The latter, however, is merely a background for the presentation of a much more comprehensive view of the human condition: a pessimistic vision of the repetitiveness and incorrigibility of this condition, the possibility of the apocalypse, the bankruptcy and ineffectiveness of European religion and culture, the 'absurd' meaninglessness of history, the intellectual artist's position and role(s) in mass-culture and an abstract, technologised mass-society, and the central theme of fragmentation - of the structure of reality, society and personality - and the artist's relation to this fragmentation, intensified in the twentieth century. Style and language are consonant with this world-picture. Many of these features recur in the travel-books (1958-61); diachronic as well as synchronic approaches characterise the presentation of various modes of contemporary society in America, Russia, France and other European countries. Important features of intellectual alienation are: the changelessness of historical motifs (e.g. tyranny, aggression), the conventions of burgher society, both old and new forms, the qualitative depreciation and standardisation of living, industrialisation and technology in complex, vulnerable and concentrated urban societies, and the ambiguities of fragmented pluralism. Reference is made to other travel-writers.
Abstract:
The IRDS standard is an international standard produced by the International Organisation for Standardisation (ISO). In this work the process for producing standards in formal standards organisations, for example the ISO, and in more informal bodies, for example the Object Management Group (OMG), is examined. This thesis examines previous models and classifications of standards. The previous models and classifications are then combined to produce a new classification. The IRDS standard is then placed in a class in the new model as a reference anticipatory standard. Anticipatory standards are standards which are developed ahead of the technology in order to attempt to guide the market. The diffusion of the IRDS is traced over a period of eleven years. The economic conditions which affect the diffusion of standards are examined, particularly the economic conditions which prevail in compatibility markets such as the IT and ICT markets. Additionally, the consequences of the introduction of gateway or converter devices into a market where a standard has not yet been established are examined. The IRDS standard did not have an installed base, and this hindered its diffusion. The thesis concludes that the IRDS standard was overtaken by new developments such as object-oriented technologies and middleware. This was partly because of the slow process of developing standards in traditional organisations which operate on a consensus basis, and partly because the IRDS standard did not have an installed base. Also, the rise and proliferation of middleware products resulted in exchange mechanisms becoming dominant rather than repository solutions. The research method used in this work is a longitudinal study of the development and diffusion of the ISO/IEC IRDS standard. The research is regarded as a single case study and follows the interpretative epistemological point of view.