961 results for number of patent applications
Abstract:
Fundação de Amparo à Pesquisa do Estado de São Paulo (FAPESP)
Abstract:
This paper deals with the orthogonal projection (in the Frobenius sense) AN of the identity matrix I onto the matrix subspace AS (A ∈ R^{n×n}, S being an arbitrary subspace of R^{n×n}). Lower and upper bounds on the normalized Frobenius condition number of matrix AN are given. Furthermore, for every matrix subspace S ⊆ R^{n×n}, a new index b_F(A, S), which generalizes the normalized Frobenius condition number of matrix A, is defined and analyzed...
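For orientation, the projection in question has a compact variational characterization; a minimal sketch in standard notation (consistent with the abstract, not quoted from the paper):

    % AN is the best approximation to the identity matrix I from the
    % subspace AS, measured in the Frobenius norm (amsmath notation):
    \[
      AN \;=\; \operatorname*{arg\,min}_{M \,\in\, AS} \lVert M - I \rVert_F,
      \qquad
      \lVert X \rVert_F \;=\; \sqrt{\operatorname{tr}\!\left(X^{\mathsf{T}} X\right)}.
    \]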
Abstract:
In this investigation I look at patents and software agents as a way to study the broader relation between law and science (the latter term understood broadly, as inclusive of both science and technology). The overall premise framing the entire discussion, my basic thesis, is that this relation between law and science cannot be understood without taking into account a number of intervening factors, the identification of which makes it necessary to approach the question from the standpoint of fields and disciplines other than law and science themselves.
Abstract:
This work presents exact algorithms for the Resource Allocation and Cyclic Scheduling Problems (RA&CSPs). Cyclic Scheduling Problems arise in a number of application areas, such as hoist scheduling, mass production, compiler design (implementing scheduling loops on parallel architectures), software pipelining, and embedded system design. The RA&CS problem concerns time and resource assignment to a set of activities, to be indefinitely repeated, subject to precedence and resource capacity constraints. In this work we present two constraint programming frameworks addressing two different types of cyclic problems. First, we consider the disjunctive RA&CSP, in which the allocation problem involves unary resources. Instances are described through the Synchronous Data-flow (SDF) Model of Computation. The key problem of finding a maximum-throughput allocation and scheduling of Synchronous Data-Flow graphs onto a multi-core architecture is NP-hard and has traditionally been solved by means of heuristic (incomplete) algorithms. We propose an exact (complete) algorithm for the computation of a maximum-throughput mapping of applications specified as SDFGs onto multi-core architectures. Results show that the approach can handle realistic instances in terms of size and complexity. Next, we tackle the Cyclic Resource-Constrained Scheduling Problem (CRCSP). We propose a Constraint Programming approach based on modular arithmetic: in particular, we introduce a modular precedence constraint and a global cumulative constraint along with their filtering algorithms. Many traditional approaches to cyclic scheduling operate by fixing the period value and then solving a linear problem in a generate-and-test fashion. Conversely, our technique is based on a non-linear model and tackles the problem as a whole: the period value is inferred from the scheduling decisions. The proposed approaches have been tested on a number of non-trivial synthetic instances and on a set of realistic industrial instances, achieving good results on problems of practical size.
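To make the modular-arithmetic idea concrete, here is a minimal illustrative sketch (the function and variable names are assumptions for illustration, not the authors' filtering algorithm): for a cyclic precedence a -> b scheduled with period lambda, one can compute the smallest iteration offset that makes the precedence feasible.

    import math

    def min_iteration_offset(start_a, dur_a, start_b, period):
        # Cyclic precedence a -> b: occurrence k of b (shifted by k periods)
        # must not start before a finishes, i.e.
        #     start_b + k * period >= start_a + dur_a.
        # Return the smallest integer k >= 0 satisfying this inequality.
        slack = (start_a + dur_a) - start_b
        return max(0, math.ceil(slack / period))

    # Example: a starts at 0 and lasts 7, b starts at 3, period is 5:
    # k = ceil((7 - 3) / 5) = 1, so b satisfies a -> b one period later.
    print(min_iteration_offset(0, 7, 3, 5))  # -> 1

With the period fixed, such checks reduce to the linear generate-and-test setting the abstract mentions; treating the period as a decision variable couples it non-linearly with the iteration offsets, which is the whole-problem view the proposed constraints are designed for.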
Abstract:
Background Several studies have shown an association of cryptogenic stroke and embolism with patent foramen ovale (PFO), but the question of how to prevent further events in such patients is unresolved. Options include antithrombotic treatment with warfarin or antiplatelet agents, or surgical or endovascular closure of the PFO. The PC-Trial was set up to compare endovascular closure and best medical treatment for prevention of recurrent events. Methods The PC-Trial is a randomized clinical trial comparing the efficacy of percutaneous closure of the PFO using the Amplatzer PFO occluder with best medical treatment in patients with cryptogenic embolism, i.e. mostly cryptogenic stroke. Warfarin for 6 months followed by antiplatelet agents is recommended as medical treatment. Randomization is stratified according to patients' age (<45 versus ≥45 years), presence of atrial septal aneurysm (ASA yes or no), and number of embolic events before randomization (one versus more than one event). Primary endpoints are death, nonfatal stroke, and peripheral embolism. Discussion Patients were randomized in 29 centers in Europe, Canada, and Australia. Randomization started in February 2000. Enrollment of 414 patients was completed in February 2009. All patients will be followed up longitudinally. Follow-up is maintained until the last enrolled patient is beyond 2.5 years of follow-up (expected in 2011).
Abstract:
The suspected cause of clinical manifestations of patent foramen ovale (PFO) is a transient or a permanent right-to-left shunt (RLS). Contrast-enhanced transcranial Doppler ultrasound (c-TCD) is a reliable alternative to transesophageal echocardiography (TEE) for diagnosis of PFO, and also enables the detection of extracardiac RLS. The air-containing echo contrast agents are injected intravenously and do not pass through the pulmonary circulation. In the presence of an RLS, the contrast agents bypass the pulmonary circulation and cause microembolic signals (MES) in the basal cerebral arteries, which are detected by TCD. The two main echo contrast agents in use are agitated saline and D-galactose microparticle solutions. At least one middle cerebral artery (MCA) is insonated, and the ultrasound probe is fixed with a headframe. The monitored Doppler spectra are stored for offline analysis (e.g., on videotape) of the time of occurrence and number of MES, which are used to assess the size and functional relevance of the RLS. The examination is more sensitive if both MCAs are investigated. If testing is negative, the examination is repeated using the Valsalva maneuver. Compared to TEE, c-TCD is more comfortable for the patient, enables an easier assessment of the size and functional relevance of the RLS, and also allows the detection of extracardiac RLS. However, c-TCD cannot localize the site of the RLS. Therefore, TEE and TCD are complementary methods and should be applied jointly in order to increase the diagnostic accuracy for detecting PFO and other types of RLS.
Abstract:
BACKGROUND The options for secondary prevention of cryptogenic embolism in patients with patent foramen ovale are administration of antithrombotic medications or percutaneous closure of the patent foramen ovale. We investigated whether closure is superior to medical therapy. METHODS We performed a multicenter, superiority trial in 29 centers in Europe, Canada, Brazil, and Australia in which the assessors of end points were unaware of the study-group assignments. Patients with a patent foramen ovale and ischemic stroke, transient ischemic attack (TIA), or a peripheral thromboembolic event were randomly assigned to undergo closure of the patent foramen ovale with the Amplatzer PFO Occluder or to receive medical therapy. The primary end point was a composite of death, nonfatal stroke, TIA, or peripheral embolism. Analysis was performed on data for the intention-to-treat population. RESULTS The mean duration of follow-up was 4.1 years in the closure group and 4.0 years in the medical-therapy group. The primary end point occurred in 7 of the 204 patients (3.4%) in the closure group and in 11 of the 210 patients (5.2%) in the medical-therapy group (hazard ratio for closure vs. medical therapy, 0.63; 95% confidence interval [CI], 0.24 to 1.62; P=0.34). Nonfatal stroke occurred in 1 patient (0.5%) in the closure group and 5 patients (2.4%) in the medical-therapy group (hazard ratio, 0.20; 95% CI, 0.02 to 1.72; P=0.14), and TIA occurred in 5 patients (2.5%) and 7 patients (3.3%), respectively (hazard ratio, 0.71; 95% CI, 0.23 to 2.24; P=0.56). CONCLUSIONS Closure of a patent foramen ovale for secondary prevention of cryptogenic embolism did not result in a significant reduction in the risk of recurrent embolic events or death as compared with medical therapy. (Funded by St. Jude Medical; ClinicalTrials.gov number, NCT00166257.).
Abstract:
Advancements in cloud computing have enabled the proliferation of distributed applications, which require management and control of multiple services. However, without an efficient mechanism for scaling services in response to changing workload conditions, such as the number of connected users, application performance may suffer, leading to violations of Service Level Agreements (SLAs) and possibly inefficient use of hardware resources. Combining dynamic application requirements with the increased use of virtualised computing resources creates a challenging resource management context for application and cloud-infrastructure owners. In such complex environments, business entities use SLAs as a means of specifying quantitative and qualitative requirements of services. There are several challenges in running distributed enterprise applications in cloud environments, ranging from the instantiation of service VMs in the correct order using an adequate quantity of computing resources, to adapting the number of running services in response to varying external loads, such as the number of users. The application owner is interested in finding the optimum amount of computing and network resources to use to ensure that the performance requirements of all her/his applications are met. She/he is also interested in appropriately scaling the distributed services so that application performance guarantees are maintained even under dynamic workload conditions. Similarly, the infrastructure providers are interested in optimally provisioning the virtual resources onto the available physical infrastructure so that their operational costs are minimized, while maximizing the performance of tenants' applications. Motivated by the complexities associated with the management and scaling of distributed applications, while satisfying multiple objectives (related to both consumers and providers of cloud resources), this thesis proposes a cloud resource management platform able to dynamically provision and coordinate the various lifecycle actions on both virtual and physical cloud resources using semantically enriched SLAs. The system focuses on dynamic sizing (scaling) of virtual infrastructures composed of virtual machines (VMs) bound to application services. We describe several algorithms for adapting the number of VMs allocated to the distributed application in response to changing workload conditions, based on SLA-defined performance guarantees. We also present a framework for dynamic composition of scaling rules for distributed services, which uses benchmark-generated application monitoring traces. We show how these scaling rules can be combined and included in semantic SLAs for controlling the allocation of services. We also provide a detailed description of the multi-objective infrastructure resource allocation problem and various approaches to solving it. We present a resource management system based on a genetic algorithm, which performs allocation of virtual resources while considering the optimization of multiple criteria. We show that our approach significantly outperforms reactive VM-scaling algorithms as well as heuristic-based VM-allocation approaches.
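As an illustration of the kind of SLA-driven scaling rule described above, here is a minimal hypothetical sketch (the function name, thresholds, and metric are assumptions for illustration, not the thesis's algorithms):

    def scale_decision(avg_response_ms, vm_count, sla_limit_ms=200,
                       scale_out_ratio=0.9, scale_in_ratio=0.5,
                       min_vms=1, max_vms=20):
        # Return the new VM count for one service tier: scale out when
        # measured latency approaches the SLA bound, scale in when there
        # is ample headroom, otherwise hold steady.
        if avg_response_ms > sla_limit_ms * scale_out_ratio:
            return min(vm_count + 1, max_vms)
        if avg_response_ms < sla_limit_ms * scale_in_ratio:
            return max(vm_count - 1, min_vms)
        return vm_count

    # Example: 190 ms latency against a 200 ms SLA bound triggers scale-out.
    print(scale_decision(avg_response_ms=190, vm_count=3))  # -> 4

A purely reactive rule like this is the baseline the thesis reports outperforming; the proposed platform instead composes benchmark-derived scaling rules into semantic SLAs and allocates resources with a multi-objective genetic algorithm.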
Abstract:
Who invents medicines for the poor of the world? This question becomes very important where the WTO allows low-income countries to be unbound by the TRIPS agreement. This agreement concerns medicines for infectious diseases such as HIV/AIDS, tuberculosis, and malaria. These diseases cause serious damage to low-income countries. Under these circumstances, some scholars wonder whether anyone will continue innovative activities related to treating these diseases. This paper sought to answer this question by collecting and analyzing patent data on medicines and vaccines for these diseases using the database of the Japan Patent Office. Results indicate that private firms have led innovation not only for global diseases such as HIV/AIDS but also for diseases such as malaria that spread exclusively in low-income countries. Innovation for the three infectious diseases varies among firms, and frequent patent applications by high-performing pharmaceutical firms remain prominent even after R&D expenditure, economies of scale, and economies of scope are taken into account.
Abstract:
In this paper, we describe a complete development platform that features different innovative acceleration strategies, not included in any other current platform, that simplify and speed up the definition of the different elements required to design a spoken dialog service. The proposed accelerations are mainly based on using the information from the backend database schema and contents, as well as cumulative information produced throughout the different steps of the design. Thanks to these accelerations, the interaction between the designer and the platform is improved, and in most cases the design is reduced to simple confirmations of the "proposals" that the platform dynamically provides at each step. In addition, the platform provides several other accelerations, such as configurable templates that can be used to define the different tasks in the service or the dialogs to obtain information from or show information to the user, automatic proposals for the best way to request slot contents from the user (i.e., using mixed-initiative forms or directed forms), an assistant that offers the set of most probable actions required to complete the definition of the different tasks in the application, and another assistant for solving specific modality details such as confirmations of user answers or how to present the lists of results retrieved after querying the backend database. Additionally, the platform also allows the creation of speech grammars and prompts, database access functions, and the use of mixed-initiative and over-answering dialogs. In the paper we also describe in detail each assistant in the platform, emphasizing the different kinds of methodologies followed to facilitate the design process in each one. Finally, we describe the results obtained in both a subjective and an objective evaluation with different designers, which confirm the viability, usefulness, and functionality of the proposed accelerations. Thanks to the accelerations, the design time is reduced by more than 56% and the number of keystrokes by 84%.
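To give a flavor of the automatic proposals mentioned above (directed versus mixed-initiative slot requests), here is a small hypothetical sketch; the heuristic, names, and threshold are assumptions for illustration, not the platform's actual logic:

    def propose_form_type(slots, grammar_coverage):
        # Mixed-initiative forms let the user fill several slots in one
        # utterance, so they pay off when there are multiple slots and the
        # speech grammar covers combined answers well; otherwise request
        # each slot with a directed form.
        if len(slots) > 1 and grammar_coverage >= 0.8:
            return "mixed-initiative"
        return "directed"

    # Example: two related slots and strong grammar coverage suggest
    # a single combined prompt.
    print(propose_form_type(["origin", "destination"], grammar_coverage=0.9))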
Abstract:
A high productivity rate in Engineering is related to efficient management of the flow of the large quantities of information, and of the associated decision-making activities, that are inherent to Engineering processes in both design and production contexts. Dealing with such problems from an integrated point of view and mimicking real scenarios is not given much attention in Engineering degrees. In the context of Engineering Education, there are a number of courses designed for developing specific competencies, as required by the academic curricula, but not that many in which integration competencies are the main target. In this paper, a course devoted to that aim is discussed. The course is taught in a Marine Engineering degree, but the philosophy could be used in any Engineering field. All the lessons are given in a computer room in which every student can use all the software applications treated in the course. The first part of the course is dedicated to Project Management: the students acquire skills in defining, using Ms-PROJECT, the work breakdown structure (WBS) and the organization breakdown structure (OBS) of Engineering projects, through a series of examples of increasing complexity, ending with the case of vessel construction. The second part of the course is dedicated to the use of a database manager, Ms-ACCESS, for managing production-related information. A series of examples of increasing complexity is treated, ending with the management of the pipe database of a real vessel. This database consists of a few thousand pipes, for which a production timing frame is defined, which connects this part of the course with the first one. Finally, the third part of the course is devoted to work with FORAN, an Engineering Production package in widespread use in the shipbuilding industry. With this package, the frames and plates where all the outfitting will be carried out are defined through cooperative work by the students, working simultaneously on the same 3D model. In the paper, specific details about the learning process are given. Surveys have been administered to the students in order to get feedback on their experience as well as to assess their satisfaction with the learning process. Results from these surveys are discussed in the paper.
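As a flavor of the pipe-database exercise described above, here is a minimal hypothetical sketch (the table layout and column names are assumptions; the course itself uses Ms-ACCESS, sqlite3 merely stands in here):

    import sqlite3

    # Tiny stand-in for the vessel pipe database used in the course.
    con = sqlite3.connect(":memory:")
    con.execute("""CREATE TABLE pipes (
        pipe_id     TEXT PRIMARY KEY,
        block       TEXT,     -- hull block where the pipe is installed
        diameter_mm INTEGER,
        prod_week   INTEGER   -- production timing frame (week number)
    )""")
    con.executemany("INSERT INTO pipes VALUES (?, ?, ?, ?)", [
        ("P-0001", "B12", 80, 14),
        ("P-0002", "B12", 50, 15),
        ("P-0003", "B07", 100, 14),
    ])

    # Workload per production week: the kind of query that links the
    # database part of the course to the project-planning part.
    for week, n in con.execute(
            "SELECT prod_week, COUNT(*) FROM pipes GROUP BY prod_week"):
        print(f"week {week}: {n} pipes")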
Abstract:
Due to its small band-gap and its high mobility, InN is a promising material for a large number of key applications, such as band-gap engineering for high-efficiency solar cells, light-emitting diodes, and high-speed devices. Unfortunately, it has been reported that this material exhibits strong surface charge accumulation, which may depend on the type of surface. Investigations are currently being conducted to explain the mechanisms that govern this behavior and to look for ways of avoiding it and/or finding applications that may exploit it. In this framework, low-frequency noise measurements have been performed at different temperatures on patterned MBE-grown InN layers. The evolution of the 1/f noise level with temperature in the 77 K-300 K range is consistent with carrier number fluctuations, thus indicating surface mechanisms: the surface charge accumulation is confirmed by the noise measurements.
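For context, the attribution to carrier number fluctuations rests on a textbook relation; a minimal sketch of the standard number-fluctuation model (not taken from the paper itself):

    % Fluctuations \delta N of the carrier number N modulate the current I,
    % so the normalized current noise tracks the normalized number noise:
    \[
      \frac{S_I(f)}{I^{2}} \;=\; \frac{S_N(f)}{N^{2}},
      \qquad
      S_N(f) \;\propto\; \frac{1}{f}.
    \]
    % Comparing the measured 1/f level against this N^{-2} scaling over
    % temperature is the standard way to separate number (surface-trapping)
    % fluctuations from bulk mobility fluctuations.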
Abstract:
Since the 1980s, the literature on economic development has paid attention to the cases of countries that industrialized after the first industrial revolution. One of the most relevant aspects analyzed has been the role of technology as a factor that promotes or delays the process of catching up with technology leaders. As a result of this interest, new and more adequate indicators were identified to provide a coherent explanation of technological activities and their relationship with economic efficiency. Although the earliest studies focused on analyzing research and development (R&D) activities, recently the focus of analysis has shifted to other types of variables, more oriented towards the processes of innovation and the accumulation of knowledge and capabilities, in which patents provide relevant information.