870 results for Time Based Management (TBM)
Abstract:
Many audio watermarking schemes divide the audio signal into several blocks such that part of the watermark is embedded into each of them. One of the key issues in these block-oriented watermarking schemes is to preserve synchronisation, i.e. to recover the exact position of each block in the mark recovery process. In this paper, a novel time-domain synchronisation technique is presented together with a new blind watermarking scheme which works in the Discrete Fourier Transform (DFT or FFT) domain. The combined scheme provides excellent imperceptibility results whilst achieving robustness against typical attacks. Furthermore, the execution of the scheme is fast enough to be used in real-time applications. The excellent transparency of the embedding algorithm makes it particularly useful for professional applications, such as the embedding of monitoring information in broadcast signals. The scheme is also compared with some recent results from the literature.
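The block-wise DFT-domain embedding described above can be sketched in miniature with a quantisation-index idea on DFT magnitudes. This is an illustrative assumption, not the paper's actual algorithm: the quantisation step, the choice of carrier bin, and the parity coding are all invented for the example, and a naive DFT stands in for an FFT library.

```python
import cmath

def dft(x):
    """Naive DFT (an FFT routine would be used in practice)."""
    N = len(x)
    return [sum(x[n] * cmath.exp(-2j * cmath.pi * k * n / N) for n in range(N))
            for k in range(N)]

def idft(X):
    N = len(X)
    return [sum(X[k] * cmath.exp(2j * cmath.pi * k * n / N) for k in range(N)).real / N
            for n in range(N)]

STEP = 0.5   # magnitude quantisation step (illustrative)
BIN = 3      # mid-frequency DFT bin carrying the bit (illustrative)

def embed_bit(block, bit):
    """Hide one bit in a block by forcing the chosen DFT magnitude to an
    even (bit 0) or odd (bit 1) multiple of STEP."""
    X = dft(block)
    mag, ph = abs(X[BIN]), cmath.phase(X[BIN])
    q = round(mag / STEP)
    if q % 2 != bit:
        q += 1                        # flip parity to encode the bit
    X[BIN] = cmath.rect(q * STEP, ph)
    X[-BIN] = X[BIN].conjugate()      # keep spectrum symmetric -> real signal
    return idft(X)

def extract_bit(block):
    """Blind detection: recover the parity of the quantised magnitude."""
    return round(abs(dft(block)[BIN]) / STEP) % 2
```

Detection needs no reference signal, which is what makes the scheme blind; synchronisation (finding where each block starts) is the separate problem the paper's time-domain technique addresses.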
Abstract:
Open source is typically outside of normal commercial software procurement processes. The challenges: an increasingly diverse and distributed set of development resources, and little or no visibility into the origins of the software. Supply chain comparison, hardware vs. software: open source has revolutionized the mobile and device landscape, and other industries will follow. Supply chain management techniques from hardware are useful for managing software. SPDX is a standard format for communicating a software Bill of Materials across the supply chain. Effective management and control require training, tools, processes and standards.
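As a concrete illustration, a minimal software Bill of Materials in SPDX's tag-value format might look like the following; the document name, namespace, package and version are invented for the example:

```text
SPDXVersion: SPDX-2.2
DataLicense: CC0-1.0
SPDXID: SPDXRef-DOCUMENT
DocumentName: example-firmware-sbom
DocumentNamespace: http://example.org/spdxdocs/example-firmware-1.0
Creator: Tool: example-sbom-generator

PackageName: zlib
SPDXID: SPDXRef-Package-zlib
PackageVersion: 1.2.13
PackageDownloadLocation: https://zlib.net/zlib-1.2.13.tar.gz
PackageLicenseConcluded: Zlib
PackageLicenseDeclared: Zlib
PackageCopyrightText: NOASSERTION
```

Each package a product ships carries its own block of this kind, so the origins and licensing of every component can be communicated downstream.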
Abstract:
The inverse scattering problem concerning the determination of the joint time-delay-Doppler-scale reflectivity density characterizing continuous target environments is addressed by recourse to the generalized frame theory. A reconstruction formula, involving the echoes of a frame of outgoing signals and its corresponding reciprocal frame, is developed. A "realistic" situation with respect to the transmission of a finite number of signals is further considered. In such a case, our reconstruction formula is shown to yield the orthogonal projection of the reflectivity density onto a subspace generated by the transmitted signals.
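The reconstruction described above follows the standard frame identity. In generic frame-theory notation (the symbols below are illustrative, not the paper's): for a frame of outgoing signals with reciprocal (dual) frame elements, the reflectivity density is recovered from its echo coefficients, and with only finitely many transmitted signals the same sum yields an orthogonal projection:

```latex
% frame expansion: echoes e_n = \langle \rho, \psi_n \rangle recover \rho
% via the reciprocal frame \tilde{\psi}_n
\rho = \sum_{n} \langle \rho, \psi_n \rangle \, \tilde{\psi}_n
% with finitely many transmitted signals \psi_1, \dots, \psi_N the sum
% yields only the orthogonal projection of \rho onto their span:
P_{V}\,\rho = \sum_{n=1}^{N} \langle \rho, \psi_n \rangle \, \tilde{\psi}_n,
\qquad V = \operatorname{span}\{\psi_1,\dots,\psi_N\}
```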
Abstract:
This correspondence studies the formulation of members of the Cohen-Posch class of positive time-frequency energy distributions. Minimization of cross-entropy measures with respect to different priors and the case of no prior or maximum entropy were considered. It is concluded that, in general, the information provided by the classical marginal constraints is very limited, and thus, the final distribution heavily depends on the prior distribution. To overcome this limitation, joint time and frequency marginals are derived based on a "direction invariance" criterion on the time-frequency plane that are directly related to the fractional Fourier transform.
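The minimum cross-entropy formulation referred to above can be written compactly. As a hedged sketch in generic notation (not necessarily the correspondence's own symbols): the positive distribution closest to a prior is chosen subject to the classical time and frequency marginals,

```latex
% minimum cross-entropy member of the Cohen-Posch class: the positive
% distribution P(t,f) closest to a prior Q(t,f), subject to the marginals
\min_{P \ge 0} \; \iint P(t,f)\,\log\frac{P(t,f)}{Q(t,f)}\;dt\,df
\quad \text{s.t.} \quad
\int P(t,f)\,df = |s(t)|^{2},
\qquad
\int P(t,f)\,dt = |S(f)|^{2}
```

where $s(t)$ is the signal and $S(f)$ its Fourier transform. The correspondence's point is that these two constraints alone pin down very little, so $P$ inherits most of its shape from the prior $Q$.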
Abstract:
Real-time, predictive condition monitoring is a very important part of the operation of a modern factory or production line. The commissioner of this Master's thesis wants to further develop its acoustic-emission-based condition monitoring system so that it would be of more benefit to the customer. The thesis includes an introduction to acoustic emission and acoustic emission sensors. The goal of the work was to develop a decision-making system to be used for the automated analysis of the data produced by the sensors manufactured by the commissioner. The work compares three different software vendors and their programs, and makes a proposal for the software to be acquired. In addition, guidelines were developed for programming the software to produce real-time information and maintenance instructions for its users. Finally, suggestions are given for the further development of the condition monitoring and decision-making system.
Abstract:
Printed electronics is an emerging concept in electronics manufacturing and is at a very early stage of development. The technology is not stable, design kits are not developed, and flows and Computer Aided Design (CAD) tools are not fixed yet. The European project TDK4PE addresses all of these issues, and this final-year project (PFC) has been carried out in that context. The goal is to develop an XML-based information system for the collection and management of information from the technology and cell libraries developed in TDK4PE. This system will ease the treatment of that information for the later generation of specific Design Kits (DK) and the corresponding documentation. This work proposes a web application to generate technology files and design kits in a formatted way; it also proposes a structure for them and a database implementation for storing the needed information. The application will allow its users to redefine the structure of those files, as well as to export and import XML files, among other formats.
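The actual TDK4PE schema is not given here; purely as an illustration, a technology and cell-library description of the kind such a system manages could be serialized as XML along these lines (every element name and value below is invented):

```text
<technology name="example-printed-tech" version="0.1">
  <layer name="conductor1" material="silver-ink" thickness-um="1.2"/>
  <designRule name="min-width" layer="conductor1" value-um="100"/>
  <cellLibrary name="example-cells">
    <cell name="inverter">
      <param name="channel-width-um" default="2000"/>
    </cell>
  </cellLibrary>
</technology>
```

Keeping the structure in a user-redefinable schema, backed by a database, is what lets the same information be re-exported later as design-kit files and documentation.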
Abstract:
Background: Many epidemiological studies of first-ever seizures have been carried out mainly on general populations. However, patients admitted to hospital may present different clinical features. We therefore conducted a prospective study of subjects in a hospital population who had experienced a first epileptic seizure, in order to study their prognosis and the role of complementary investigations (neurological examination, brain imaging, blood tests, EEG) in the decision to administer antiepileptic medication. Methods: Over a one-year period, we followed 177 consecutively admitted adult patients who had presented an epileptic seizure and whose acute evaluation was performed in our hospital. For six months, each patient's antiepileptic treatment, seizure recurrences and any death were followed up. Results: The neurological examination was abnormal in 72.3% of cases, brain imaging in 54.8% and blood tests in 57.1%. The EEG showed epileptiform features in 33.9% of cases. The most frequent aetiology was intoxication. Antiepileptic treatment was prescribed in 51% of patients. 31.6% of the subjects followed to six months had a recurrence; mortality was 17.8%. Statistically, brain imaging, the EEG and the neurological examination were independent predictive factors for the administration of antiepileptics, and brain imaging was the only factor associated with prognosis. Conclusions: Patients acutely evaluated in hospital for a first seizure have an underlying medical profile that probably explains their poor prognosis. Brain imaging proved to be the most important paraclinical test for predicting both treatment and prognosis.
Keywords: first epileptic seizure, aetiology, prognosis, recurrence, antiepileptic medication, hospital population. Summary Background: Epidemiological studies focusing on first-ever seizures have been carried out mainly on community-based populations. However, since hospital populations may display varying clinical features, we prospectively analysed patients with a first-ever seizure in a hospital-based community to evaluate prognosis and the role of complementary investigations in the decision to administer antiepileptic drugs (AED). Methods: Over one year, we recruited 177 consecutive adult patients with a first seizure acutely evaluated in our hospital. During six months' follow-up, data relating to AED treatment, recurrence of seizures and death were collected for each patient. Results: Neurological examination was abnormal in 72.3%, neuroimaging in 54.8% and biochemical tests in 57.1%. Electroencephalogram (EEG) showed epileptiform features in 33.9%. Toxicity represented the most common aetiology. AED treatment was prescribed in 51% of patients. Seizure recurrence at six months involved 31.6% of patients completing the follow-up; mortality was 17.8%. Statistical analysis showed that brain CT, EEG and neurological examination are independent predictive factors for AED administration, but only CT scan is associated with outcome. Conclusions: Patients evaluated acutely for a first-ever seizure in a hospital setting have severe underlying clinical conditions apparently related to their relatively poor prognosis. Neuroimaging represents the most important paraclinical test in predicting both treatment administration and outcome.
Abstract:
This thesis supplements the systematic approach to competitive intelligence and competitor analysis by introducing an information-processing perspective on management of the competitive environment and the competitors therein. The cognitive questions connected to the intelligence process, and the means that organizational actors use in sharing information, are discussed. The ultimate aim has been to deepen knowledge of the different intraorganizational processes that are used in a corporate organization to manage and exploit the vast amount of competitor information that is received from the environment. Competitor information and competitive knowledge management is examined as a process in which organizational actors identify and perceive the competitive environment by using cognitive simplification, make interpretations resulting in learning, and finally utilize competitor information and competitive knowledge in their work processes. The sharing of competitive information and competitive knowledge is facilitated by intraorganizational networks that evolve as a means of developing a shared, organizational-level knowledge structure and ensuring that the right information is in the right place at the right time. This thesis approaches competitor information and competitive knowledge management both theoretically and empirically. Based on the conceptual framework developed by theoretical elaboration, further understanding of the studied phenomena is sought through an empirical study. The empirical research was carried out in a multinationally operating forest industry company. This thesis makes some preliminary suggestions for improving the competitive intelligence process.
It is concluded that managing competitor information and competitive knowledge is not simply a question of managing information flow or improving the sophistication of competitor analysis; the crucial question to be solved is rather how to improve the cognitive capabilities connected to identifying and interpreting the competitive environment, and how to increase learning. It is claimed that competitive intelligence cannot be treated like an organizational function or assigned solely to a specialized intelligence unit.
Abstract:
There is a broad consensus among economists that technological change has been a major contributor to productivity growth and, hence, to the growth of material welfare in western industrialized countries, at least over the last century. Paradoxically, this issue has not been the focal point of theoretical economics. At the same time, we have witnessed the rise of the importance of technological issues at the strategic management level of business firms. Interestingly, research has not accurately responded to this challenge either. The tension between the overwhelming empirical evidence of the importance of technology and its relative omission in research offers a challenging target for a methodological endeavour. This study deals with the question of how different theories cope with technology and explain technological change. The focus is at the firm level and the analysis concentrates on metatheoretical issues, except for the last two chapters, which examine the problems of strategic management of technology. Here the aim is to build a new evolutionary-based theoretical framework to analyze innovation processes at the firm level. The study consists of ten chapters. Chapter 1 poses the research problem and contrasts the two basic approaches, neoclassical and evolutionary, to be analyzed. Chapter 2 introduces the methodological framework, which is based on the methodology of isolation. Methodological and ontological commitments of the rival approaches are revealed and basic questions concerning their ways of theorizing are elaborated. Chapters 3-6 deal with the so-called substantive isolative criteria.
The aim is to examine how different approaches cope with such critical issues as the inherent uncertainty and complexity of innovative activities (cognitive isolations, chapter 3), the boundedness of rationality of innovating agents (behavioral isolations, chapter 4), the multidimensional nature of technology (chapter 5), and governance costs related to technology (chapter 6). Chapters 7 and 8 put all these things together and look at the explanatory structures used by the neoclassical and evolutionary approaches in the light of substantive isolations. The last two chapters of the study utilize the methodological framework and tools to appraise different economics-based candidates in the context of strategic management of technology. The aim is to analyze how different approaches answer the fundamental question: how can firms gain competitive advantages through innovations, and how can the rents appropriated from successful innovations be sustained? The last chapter introduces a new evolutionary-based technology management framework. The largely omitted issues of entrepreneurship are also examined.
Abstract:
In order to improve the management of copyright on the Internet, known as Digital Rights Management, there is a need for a shared language for copyright representation. Current approaches are based on purely syntactic solutions, i.e. a grammar that defines a rights expression language. These languages are difficult to put into practice due to the lack of explicit semantics that would facilitate their implementation. Moreover, they are simple from the legal point of view because they are intended just to model the usage licenses granted by content providers to end-users. Thus, they ignore the copyright framework that lies behind them and the whole value chain from creators to end-users. Our proposal is to use a semantic approach based on semantic web ontologies. We detail the development of a copyright ontology in order to put this approach into practice. It models the copyright core concepts for creation, rights and the basic kinds of actions that operate on content. Altogether, it allows building a copyright framework for the complete value chain. The set of actions operating on content are our smallest building blocks for coping with the complexity of copyright value chains and statements while, at the same time, guaranteeing a high level of interoperability and evolvability. The resulting copyright modelling framework is flexible and complete enough to model many copyright scenarios, not just those related to the economic exploitation of content. The ontology also includes moral rights, so it is possible to model such situations, as shown in the included example model for a withdrawal scenario. Finally, the ontology design and the selection of tools result in a straightforward implementation. Description Logic reasoners are used for license checking and retrieval. Rights are modelled as classes of actions, action patterns are also modelled as classes, and the same is done for concrete actions.
Checking whether some right or license grants an action is then reduced to a class subsumption check, which is a direct functionality of these reasoners.
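The subsumption idea can be shown in miniature: if rights are classes of actions and a license grants some action classes, then "does this license cover this concrete action?" becomes a subclass check. The class names below are invented for the example, and Python's class hierarchy merely stands in for an OWL ontology queried through a Description Logic reasoner:

```python
# Rights and licenses modelled as classes of actions; "does this license
# grant this action?" becomes a subsumption (subclass) check.
class Action: ...
class Communicate(Action): ...
class Broadcast(Communicate): ...   # broadcasting is a kind of communication


class License:
    def __init__(self, granted):
        self.granted = granted      # tuple of action classes the license covers

    def grants(self, action_cls):
        # an action is covered if its class is subsumed by a granted class
        return any(issubclass(action_cls, g) for g in self.granted)


lic = License(granted=(Communicate,))
lic.grants(Broadcast)   # True: Broadcast is subsumed by Communicate
```

In the ontology proper, the same question is answered by asking the reasoner whether the action's class is subsumed by a class granted in the license, so license checking never needs bespoke rule code.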
Abstract:
This work proposes the development of an embedded real-time fruit detection system for future automatic fruit harvesting. The proposed embedded system is based on an ARM Cortex-M4 (STM32F407VGT6) processor and an Omnivision OV7670 color camera. The future goal of this embedded vision system will be to control a robotized arm to automatically select and pick some fruit directly from the tree. The complete embedded system has been designed to be placed directly in the gripper tool of the future robotized harvesting arm. The embedded system will be able to perform real-time fruit detection and tracking by using a three-dimensional look-up table (LUT) defined in the RGB color space and optimized for fruit picking. Additionally, two different methodologies for creating optimized 3D LUTs based on existing linear color models and fruit histograms were implemented in this work and compared for the case of red peaches. The resulting system is able to acquire general and zoomed orchard images and to update the relative tracking information of a red peach in the tree ten times per second.
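The 3D-LUT approach can be sketched as follows: quantize each RGB axis into a small number of bins, precompute a fruit/background decision for every cell from a linear color model, and then classify each pixel with a single table lookup. The bin count and the "red peach" thresholds below are invented for the illustration; the paper's LUTs are built from its own linear color models and fruit histograms:

```python
BINS = 16  # 16x16x16 LUT: each RGB axis quantized to 16 bins (assumption)

def red_model(r, g, b):
    """Illustrative linear rule for 'red peach' pixels (thresholds invented)."""
    return r > 120 and r - g > 40 and r - b > 40

def build_lut(is_fruit):
    """Precompute the fruit/background decision for every quantized RGB cell."""
    lut = {}
    for r in range(BINS):
        for g in range(BINS):
            for b in range(BINS):
                # evaluate the model at the cell centre in 0..255 space
                c = tuple(int((v + 0.5) * 256 / BINS) for v in (r, g, b))
                lut[(r, g, b)] = is_fruit(*c)
    return lut

LUT = build_lut(red_model)

def classify(r, g, b):
    """O(1) per-pixel test: quantize the pixel and look it up."""
    q = tuple(v * BINS // 256 for v in (r, g, b))
    return LUT[q]
```

Moving all the arithmetic into the precomputed table is what makes ten detection updates per second feasible on a Cortex-M4 class processor, where per-pixel floating-point model evaluation would be too slow.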
Abstract:
Zonal management in vineyards requires the prior delineation of stable yield zones within the parcel. Among the different methodologies used for zone delineation, cluster analysis of yield data from several years is one of the possibilities cited in the scientific literature. However, reasonable doubts remain concerning the cluster algorithm to be used and the number of zones that should be delineated within a field. In this paper two different cluster algorithms have been compared (k-means and fuzzy c-means) using the grape yield data corresponding to three successive years (2002, 2003 and 2004) for a ‘Pinot Noir’ vineyard parcel. The final choice of the most recommendable algorithm has been linked to obtaining a stable pattern of spatial yield distribution and to allowing for the delineation of compact and average-sized areas. The general recommendation is to use reclassified maps of two clusters or yield classes (low-yield zone and high-yield zone) and, consequently, site-specific vineyard management should be based on the prior delineation of just two different zones or sub-parcels. The two tested algorithms are good options for this purpose. However, the fuzzy c-means algorithm allows for a better zoning of the parcel, forming more compact areas with more balanced zonal differences over time.
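The k-means variant of this zoning can be sketched directly: each grid cell of the parcel contributes one point, here a three-value vector of yields (one per year, e.g. 2002, 2003, 2004), and k = 2 splits the cells into low-yield and high-yield zones. This is a plain illustrative implementation with a naive deterministic initialisation, not the paper's procedure, and a library routine would normally be used instead:

```python
def kmeans(points, k=2, iters=50):
    """Plain k-means on per-cell yield vectors (one value per year)."""
    def nearest(p, cents):
        d = [sum((a - b) ** 2 for a, b in zip(p, c)) for c in cents]
        return d.index(min(d))

    # naive deterministic initialisation: seeds spread across the input order
    centroids = [points[i * (len(points) - 1) // (k - 1)] for i in range(k)]
    for _ in range(iters):
        # assign each point to its nearest centroid (squared Euclidean)
        clusters = [[] for _ in range(k)]
        for p in points:
            clusters[nearest(p, centroids)].append(p)
        # recompute centroids as cluster means (keep old one if cluster empty)
        new = [tuple(sum(vs) / len(cl) for vs in zip(*cl)) if cl else centroids[i]
               for i, cl in enumerate(clusters)]
        if new == centroids:
            break                    # converged
        centroids = new
    return centroids, [nearest(p, centroids) for p in points]
```

Fuzzy c-means differs only in that each cell receives a graded membership in every cluster rather than a hard label, which is what tends to produce the more compact, better-balanced zones the paper reports.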
Abstract:
Enhanced Recovery After Surgery (ERAS) is a multimodal, standardized and evidence-based perioperative care pathway. With ERAS, postoperative complications are significantly lowered and, as a secondary effect, length of hospital stay and health costs are reduced. The patient recovers better and faster, which also reduces the workload of healthcare providers. Although hospital discharge occurs sooner, there is no increased burden on outpatient care. ERAS can be safely applied to any patient through a tailored approach. The general practitioner plays an essential role in ERAS by ensuring the continuity of information and the follow-up of the patient.
Abstract:
To achieve success in a constantly changing environment and with ever-increasing competition, companies must develop their operations continuously. To do this, they must have a clear vision of what they want to be in the future. This vision can be attained through careful planning and strategising. One method of transforming a strategy and vision into an everyday tool used by employees is a balanced performance measurement system. The importance of performance measurement in the implementation of companies' visions and strategies has grown substantially in the last ten years. Measures are derived from the company's critical success factors and from many different perspectives, covering three time dimensions: past, present and future. Many such performance measurement systems have been created since the 1990s. This is a case study whose main objective is to provide a recommendation for how the case company could make use of performance measurement to support strategic management. To answer this question, the study uses literature-based research and empirical research at the case company's premises. The theoretical part of the study consists of two sections: introducing the Balanced Scorecard and discussing how it supports strategic management and change management. The empirical part of the study determines the company's present performance measurement situation through interviews in the company. The study resulted in a recommendation that the company start developing a Balanced Scorecard system. By setting up this kind of process, the company would be able to shift its focus more towards the future, begin to implement a more process-based organisation and get its employees to work together towards common goals.