961 results for: number of patent applications


Relevance: 100.00%

Abstract:

The regional economic impact of biofuel production depends upon a number of interrelated factors: the specific biofuel feedstock and production technology employed; the sector's embeddedness in the rest of the economy, through its demand for local resources; and the extent to which new activity is created. These issues can be analysed using multisectoral economic models. Some studies have used (fixed-price) Input-Output (IO) and Social Accounting Matrix (SAM) modelling frameworks, whilst a nascent Computable General Equilibrium (CGE) literature has also begun to examine the regional (and national) impact of biofuel development. This paper reviews, compares and evaluates these approaches for modelling the regional economic impacts of biofuels.
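
As a point of reference for the (fixed-price) IO framework mentioned above, the sketch below computes output multipliers from a Leontief inverse for a hypothetical three-sector economy containing a biofuel sector. The sector labels and coefficient values are invented for illustration, not taken from the paper.

```python
# Minimal sketch of the fixed-price Input-Output framework: total output x
# satisfies x = Ax + d, so x = (I - A)^{-1} d (the Leontief inverse).
# The 3-sector economy and coefficients here are hypothetical.
import numpy as np

# Technical coefficients: A[i, j] = input from sector i per unit output of j.
# Sectors: 0 = agriculture (feedstock), 1 = biofuel, 2 = rest of economy.
A = np.array([
    [0.10, 0.35, 0.05],   # agriculture supplies feedstock to biofuel
    [0.02, 0.05, 0.03],
    [0.20, 0.25, 0.15],
])

leontief_inverse = np.linalg.inv(np.eye(3) - A)

# Impact of one extra unit of final demand for biofuel (sector 1):
d = np.array([0.0, 1.0, 0.0])
x = leontief_inverse @ d
print("Output required per unit of biofuel demand:", x.round(3))
print("Total output multiplier:", x.sum().round(3))
```

The column sum of the Leontief inverse for the biofuel sector is the output multiplier: the more the sector draws on local inputs, the larger this number, which is exactly the "embeddedness" channel the abstract describes.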

Relevance: 100.00%

Abstract:

Age is the main clinical determinant of large artery stiffness. Central arteries stiffen progressively with age, whereas peripheral muscular arteries change little with age. A number of clinical studies have analyzed the effects of age on aortic stiffness. The increase in central artery stiffness with age is responsible for earlier wave reflections and changes in pressure wave contours. The stiffening of the aorta and other central arteries is a potential risk factor for increased cardiovascular morbidity and mortality. Arterial stiffening with aging is accompanied by an elevation in systolic blood pressure (BP) and pulse pressure (PP). Although arterial stiffening with age is common, it has now been confirmed that older subjects with increased arterial stiffness and elevated PP have higher cardiovascular morbidity and mortality. The increase in aortic stiffness with age occurs gradually and continuously, and similarly for men and women. Cross-sectional studies have shown that aortic and carotid stiffness (evaluated by pulse wave velocity) increase with age by approximately 10% to 15% over a period of 10 years. Women consistently have 5% to 10% lower stiffness than men of the same age. Although large artery stiffness increases with age independently of the presence of cardiovascular risk factors or other associated conditions, the extent of this increase may depend on several environmental or genetic factors. Hypertension may increase arterial stiffness, especially in older subjects. Among other cardiovascular risk factors, type 1 and type 2 diabetes accelerate arterial stiffening, whereas the role of dyslipidemia and tobacco smoking is unclear. Arterial stiffness is also present in several cardiovascular and renal diseases. Patients with heart failure, end-stage renal disease, and those with atherosclerotic lesions often develop central artery stiffness. Decreased carotid distensibility, increased arterial thickness, and the presence of calcifications and plaques often coexist in the same subject. However, the relationships between these three alterations of the arterial wall remain to be explored.
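
To make the quoted rates concrete, here is a back-of-the-envelope sketch projecting pulse wave velocity (PWV) across decades using the 10-15% per decade and 5-10% sex-difference figures above. The baseline value is an assumption chosen for illustration, not data from the cited studies.

```python
# Illustrative projection of the rates quoted above: aortic PWV rising
# roughly 10-15% per decade, with women about 5-10% lower than men of the
# same age. The baseline PWV is a hypothetical value.
baseline_pwv_m_per_s = 8.0   # assumed PWV for a 50-year-old man
for decades in range(0, 4):
    low = baseline_pwv_m_per_s * 1.10 ** decades
    high = baseline_pwv_m_per_s * 1.15 ** decades
    age = 50 + 10 * decades
    print(f"age {age}: men ~{low:.1f}-{high:.1f} m/s, "
          f"women ~{low * 0.90:.1f}-{high * 0.95:.1f} m/s")
```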

Relevance: 100.00%

Abstract:

We present a dynamic model where the accumulation of patents generates an increasing number of claims on sequential innovation. We compare innovation activity under three regimes (patents, no patents, and patent pools) and find that none of them can reach the first best. We find that the first best can be reached through a decentralized tax-subsidy mechanism, by which innovators receive a subsidy when they innovate and are taxed as subsequent innovations build on theirs. This finding implies that optimal transfers work in exactly the opposite way from traditional patents. Finally, we consider patents of finite duration and determine the optimal patent length.
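
A toy sketch of the transfer scheme just described may help: each innovator in a sequential chain receives a subsidy upon innovating and pays a tax for every later innovation that builds on hers, reversing the direction of traditional royalty payments. The subsidy and tax values are hypothetical, not the paper's calibration.

```python
# Toy sketch of the decentralized tax-subsidy mechanism: an innovator
# receives a subsidy s when innovating and pays a tax t for each subsequent
# innovation in the chain -- the reverse of a royalty-bearing patent, where
# early innovators are paid by later ones. Parameter values are hypothetical.
def net_transfer(position: int, chain_length: int,
                 subsidy: float = 1.0, tax: float = 0.3) -> float:
    """Net transfer to the innovator at `position` (0-indexed)
    in a sequential chain of `chain_length` innovations."""
    later_innovations = chain_length - position - 1
    return subsidy - tax * later_innovations

chain = 5
for i in range(chain):
    print(f"innovator {i}: net transfer {net_transfer(i, chain):+.1f}")
# Early innovators end up net payers and later ones net recipients --
# the opposite of the payment flow under traditional patents.
```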

Relevance: 100.00%

Abstract:

Floods are the natural hazard that produces the highest number of casualties and the greatest material damage in the Western Mediterranean. Improved flood risk assessment and study of a possible increase in flood occurrence are therefore needed. To carry out these tasks it is important to have extensive knowledge of historical floods at our disposal and to find an efficient way to manage this geographical data. In this paper we present a complete flood database spanning the 20th century for the whole of Catalonia (NE Spain), which includes documentary information (affected areas and damage) and instrumental information (meteorological and hydrological records). This geodatabase, named Inungama, has been implemented on a GIS (Geographical Information System) in order to display all the information within a given geographical scenario, as well as to analyse it using queries, overlays and calculations. Following a description of the type and amount of information stored in the database and the structure of the information system, the first applications of Inungama are presented. The geographical distribution of floods shows which localities are more likely to be flooded, confirming that the most affected municipalities are the most densely populated ones in coastal areas. Regarding a possible increase in flood occurrence, a temporal analysis has been carried out, showing a steady increase over the last 30 years.
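
The temporal analysis described above amounts to the kind of per-decade aggregation sketched below. SQLite stands in for the GIS geodatabase, and the table layout, column names and sample rows are invented for illustration, not Inungama's actual schema.

```python
# Sketch of a temporal query against a flood geodatabase: count events per
# decade to look for a trend. SQLite stands in for the real GIS back end;
# the table, columns and sample rows are invented.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE flood_events (event_date TEXT, municipality TEXT)")
conn.executemany(
    "INSERT INTO flood_events VALUES (?, ?)",
    [("1962-09-25", "Terrassa"), ("1982-11-07", "Lleida"),
     ("1994-10-10", "Tarragona"), ("2000-06-10", "Barcelona")],
)

# Truncate each event year to its decade, then count events per decade.
for decade, n in conn.execute(
    "SELECT (CAST(strftime('%Y', event_date) AS INT) / 10) * 10 AS decade,"
    "       COUNT(*) FROM flood_events GROUP BY decade ORDER BY decade"
):
    print(f"{decade}s: {n} event(s)")
```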

Relevance: 100.00%

Abstract:

The research reported in this series of articles aimed at (1) automating the search of questioned ink specimens in ink reference collections and (2) evaluating the strength of ink evidence in a transparent and balanced manner. These aims require that ink samples are analysed in an accurate and reproducible way and that they are compared in an objective and automated way. The latter requirement is due to the large number of comparisons that are necessary in both scenarios. A research programme was designed to (a) develop a standard methodology for analysing ink samples in a reproducible way, (b) compare ink samples automatically and objectively, and (c) evaluate the proposed methodology in forensic contexts. This report focuses on the last of the three stages of the research programme. The calibration and acquisition process and the mathematical comparison algorithms were described in previous papers [C. Neumann, P. Margot, New perspectives in the use of ink evidence in forensic science-Part I: Development of a quality assurance process for forensic ink analysis by HPTLC, Forensic Sci. Int. 185 (2009) 29-37; C. Neumann, P. Margot, New perspectives in the use of ink evidence in forensic science-Part II: Development and testing of mathematical algorithms for the automatic comparison of ink samples analysed by HPTLC, Forensic Sci. Int. 185 (2009) 38-50]. In this paper, the benefits and challenges of the proposed concepts are tested in two forensic contexts: (1) ink identification and (2) ink evidential value assessment. The results show that different algorithms are better suited to different tasks. This research shows that it is possible to build digital ink libraries using the most commonly used ink analytical technique, i.e. high-performance thin-layer chromatography (HPTLC), despite its reputation for lacking reproducibility. More importantly, it is possible to assign evidential value to ink evidence in a transparent way using a probabilistic model. It is therefore possible to move away from the traditional subjective approach, which is based entirely on expert opinion and is usually not very informative. While there is room for improvement, this report demonstrates the significant gains over the traditional subjective approach in the searching of ink specimens in ink databases and in the interpretation of their evidential value.
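
To illustrate what an automated, objective comparison of HPTLC profiles can look like, the sketch below scores two densitometric intensity profiles with a Pearson correlation. This is a generic stand-in, not the specific algorithms developed in Parts I and II of the series, and the profiles are synthetic.

```python
# Illustration of automated, objective ink comparison: score the similarity
# of two HPTLC densitometric profiles with a Pearson correlation. A generic
# stand-in for the purpose-built algorithms of Parts I-II; data is synthetic.
import numpy as np

def similarity(profile_a: np.ndarray, profile_b: np.ndarray) -> float:
    """Pearson correlation between two equal-length intensity profiles."""
    return float(np.corrcoef(profile_a, profile_b)[0, 1])

rng = np.random.default_rng(0)
questioned = rng.random(200)                        # questioned ink profile
same_ink = questioned + rng.normal(0, 0.05, 200)    # replicate, small noise
different_ink = rng.random(200)                     # unrelated reference ink

print(f"same ink:      {similarity(questioned, same_ink):.3f}")
print(f"different ink: {similarity(questioned, different_ink):.3f}")
# A library search would rank reference inks by such a score; a probabilistic
# model then turns scores into evidential value (e.g. likelihood ratios).
```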

Relevance: 100.00%

Abstract:

Self-measurement of blood pressure (SMBP) is increasingly used to assess blood pressure outside the medical setting. A prerequisite for the wide use of SMBP is the availability of validated devices providing reliable readings when they are handled by patients. This is the case today with a number of fully automated oscillometric devices. A major advantage of SMBP is the large number of readings it provides, which is linked with high reproducibility. Given these advantages, one of the major indications for SMBP is the evaluation of antihypertensive treatment, either for individual patients in everyday practice or in clinical trials intended to characterize the effects of blood-pressure-lowering medications. In fact, SMBP is particularly helpful for evaluating resistant hypertension and for detecting a white-coat effect in patients exhibiting high office blood pressure under antihypertensive therapy. SMBP may also motivate the patient and improve his or her adherence to long-term treatment. Moreover, SMBP can be used as a sensitive technique for evaluating the effect of antihypertensive drugs in clinical trials; it increases the power of comparative trials, allowing one to study fewer patients or to detect smaller differences in blood pressure than would be possible with office measurement. Therefore, SMBP can be regarded as a valuable technique for the follow-up of treated patients as well as for the assessment of antihypertensive drugs in clinical trials.
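
The power gain mentioned above can be made concrete with the standard two-sample size formula n = 2(z_{1-α/2} + z_{1-β})² σ²/Δ²: averaging many self-measured readings lowers the standard deviation of the outcome, and the required sample size scales with σ². The standard deviations below are assumptions chosen for illustration, not figures from the text.

```python
# The power gain comes from lower variability: averaging many SMBP readings
# shrinks the outcome's standard deviation, and the two-sample size formula
# n = 2 * (z_a + z_b)^2 * sigma^2 / delta^2 scales with sigma^2.
# All numbers below are illustrative assumptions.
from math import ceil
from statistics import NormalDist

z = NormalDist().inv_cdf

def n_per_group(sigma: float, delta: float,
                alpha: float = 0.05, power: float = 0.80) -> int:
    return ceil(2 * (z(1 - alpha / 2) + z(power)) ** 2
                * sigma ** 2 / delta ** 2)

# Detect a 5 mmHg difference in systolic BP between two treatments:
print("office BP (sd ~12 mmHg):", n_per_group(12, 5), "patients/group")
print("SMBP mean (sd ~7 mmHg): ", n_per_group(7, 5), "patients/group")
```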

Relevance: 100.00%

Abstract:

Visualization is a relatively recent tool available to engineers for enhancing transportation project design through improved communication, decision making, and stakeholder feedback. Current visualization techniques include image composites, video composites, 2D drawings, drive-through or fly-through animations, 3D rendering models, virtual reality, and 4D CAD. These methods are used mainly to communicate within the design and construction team and between the team and external stakeholders. Use of visualization improves understanding of design intent and project concepts and facilitates effective decision making. However, visualization tools are typically used for presentation purposes only, in large-scale urban projects. Visualization is not widely accepted due to a lack of demonstrated engineering benefits for typical agency projects, such as small- and medium-sized projects, rural projects, and projects where external stakeholder communication is not a major issue. Furthermore, there is a perceived high cost of investment, in both financial and human capital, in adopting visualization tools. The most advanced visualization technique, virtual reality, has been used only in academic research settings, and 4D CAD has been used on a very limited basis for highly complicated specialty projects. However, a number of less intensive visualization methods are available which may provide some benefit to many agency projects. In this paper, we present the results of a feasibility study examining the use of visualization and simulation applications for improving highway planning, design, construction, and safety and mobility.

Relevance: 100.00%

Abstract:

1. There is ample epidemiological and anecdotal evidence that a PFO increases the risk of stroke in both young and elderly patients, although only in a modest way: PFOs are more prevalent in patients with cryptogenic (unexplained) stroke than in healthy subjects, and more prevalent in cryptogenic stroke than in strokes of other causes. Furthermore, multiple case series confirm an association with paradoxical embolism across a PFO in patients with deep vein thrombosis and/or pulmonary emboli.

2. Is stroke recurrence risk in PFO patients really not elevated compared to PFO-free patients, as suggested by traditional observational studies? This finding is an epidemiological artifact called "the paradox of recurrence risk research" (Dahabreh & Kent, JAMA 2011) and is due to one (minor) risk factor, such as PFO, being wiped out by other, stronger risk factors in the control population.

3. Having identified PFO as a risk factor for a first stroke, and probably also for recurrences, we have to treat it, because treating risk factors has always paid off. No one would nowadays question the aggressive treatment of other stroke risk factors such as hypertension, atrial fibrillation, smoking, or hyperlipidemia.

4. To be effective, the preventive treatment has to control the risk factor (i.e. effectively close the PFO) and must have little or no side effects. Both these conditions are now fulfilled thanks to the increasing expertise of cardiologists with technically advanced closure devices and solid backup by multidisciplinary stroke teams.

5. Closing a PFO does not exempt us from treating other stroke risk factors aggressively, given that these are cumulative with PFO.

6. The most frequent reason why patients have a stroke recurrence after PFO closure is not that closure is ineffective, but that the initial stroke etiology was insufficiently investigated and not PFO related, and that the recurrence is due to another mechanism because of poor risk factor control.

7. Similarly, the randomized CLOSURE study was negative because (a) patients were included who had a low chance that their initial event was due to the PFO, (b) patients were selected with a low chance that a PFO-related recurrence would occur, (c) there was an unacceptably high rate of closure-related side effects, and (d) the number of randomized patients was too small for a prevention trial.

8. It is only a question of time until a sufficiently large randomized clinical trial with true PFO-related stroke patients and a high PFO-related recurrence risk is performed and shows the effectiveness of closure.

9. PFO being a rather modest risk factor for stroke does not mean we should prevent our patients from getting the best available prevention by the best physicians in the best stroke centers.

Therefore, a PFO closure performed by an excellent cardiologist, following the recommendation of an expert neurovascular specialist after a thorough workup in a leading stroke center, is one of the most effective stroke prevention treatments available in 2011.

Relevance: 100.00%

Abstract:

Metal industries producing thick sections have shown increasing interest in the laser–arc hybrid welding process because of its clear advantages compared with the individual processes of autogenous laser welding and arc welding. One major benefit of laser–arc hybrid welding is that joints with larger gaps can be welded with acceptable quality compared to autogenous laser welding. The laser–arc hybrid welding process has good potential to extend the field of applications of laser technology, and to provide significant improvements in weld quality and process efficiency in manufacturing applications. The objective of this research is to present a parameter set-up for laser–arc hybrid welding processes, introduce a methodical comparison of the chosen parameters, and discuss how this technology may be adopted in industrial applications. The research describes the principles, means and applications of different types of laser–arc hybrid welding processes. Processing variables from the conducted experiments are presented and compared using an analytical model which can also be used for predictive simulations. The main argument in this thesis is that a profound understanding of the advanced technology of laser–arc hybrid welding will help improve the productivity of welding in industrial applications. Based on a review of the current knowledge base, important areas for further research are also identified. This thesis consists of two parts. The first part introduces the research topic and discusses laser–arc hybrid welding by characterizing its mechanism and most important variables. The second part comprises four research papers elaborating on the performance of laser–arc hybrid welding in the joining of metals. The study uses quantitative and qualitative research methods, which include in-depth, interpretive analyses of results from a number of research groups. In the interpretive analysis, the emphasis is placed on the relevance and usefulness of the investigative results drawn from other research publications. The results of this study contribute to research on laser–arc hybrid welding by increasing understanding of how old and new perspectives on laser–arc hybrid welding are evidenced in industry. The research methodology applied permits continued exploration of how laser–arc hybrid welding and various process factors influence the overall quality of the weld. The study provides a good foundation for future research, creates improved awareness of the laser–arc hybrid welding process, and assists the metal industry in maximizing welding productivity.

Relevance: 100.00%

Abstract:

Systems biology is a new, emerging and rapidly developing multidisciplinary research field that aims to study biochemical and biological systems from a holistic perspective, with the goal of providing a comprehensive, system-level understanding of cellular behaviour. In this way, it addresses one of the greatest challenges faced by contemporary biology, which is to comprehend the function of complex biological systems. Systems biology combines various methods that originate from scientific disciplines such as molecular biology, chemistry, engineering sciences, mathematics, computer science and systems theory. Systems biology, unlike "traditional" biology, focuses on high-level concepts such as network, component, robustness, efficiency, control, regulation, hierarchical design, synchronization, concurrency, and many others. The very terminology of systems biology is "foreign" to "traditional" biology; it marks a drastic shift in the research paradigm and indicates the close linkage of systems biology to computer science. One of the basic tools utilized in systems biology is the mathematical modelling of life processes, tightly linked to experimental practice. The studies contained in this thesis revolve around a number of challenges commonly encountered in computational modelling in systems biology. The research comprises the development and application of a broad range of methods, originating in the fields of computer science and mathematics, for the construction and analysis of computational models in systems biology. In particular, the research is set up in the context of two biological phenomena chosen as modelling case studies: 1) the eukaryotic heat shock response and 2) the in vitro self-assembly of intermediate filaments, one of the main constituents of the cytoskeleton. The range of presented approaches spans from heuristic, through numerical and statistical, to analytical methods applied in the effort to formally describe and analyse the two biological processes. We note, however, that although applied to certain case studies, the presented methods are not limited to them and can be utilized in the analysis of other biological mechanisms as well as complex systems in general. The full range of developed and applied modelling techniques, together with the model analysis methodologies, constitutes a rich modelling framework. Moreover, the presentation of the developed methods, their application to the two case studies and the discussion of their potentials and limitations point to the difficulties and challenges one encounters in computational modelling of biological systems. The problems of model identifiability, model comparison, model refinement, model integration and extension, the choice of the proper modelling framework and level of abstraction, and the choice of the proper scope of the model run through this thesis.
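
As a minimal illustration of the kind of ODE-based modelling such case studies rely on, the sketch below integrates a toy mass-action system (monomers reversibly forming dimers, loosely reminiscent of filament assembly). It is a generic example with invented rate constants, not the thesis's actual heat shock or filament model.

```python
# Toy mass-action ODE model integrated numerically: 2 M <-> D with forward
# rate k_on and reverse rate k_off. A generic illustration of ODE-based
# modelling in systems biology; rate constants are invented.
import numpy as np
from scipy.integrate import solve_ivp

def rhs(t, y, k_on, k_off):
    monomer, dimer = y
    d_dimer = k_on * monomer ** 2 - k_off * dimer   # net dimerization rate
    return [-2.0 * d_dimer, d_dimer]                # 2 monomers per dimer

sol = solve_ivp(rhs, (0.0, 10.0), [1.0, 0.0], args=(0.5, 0.1),
                dense_output=True)
for ti in np.linspace(0.0, 10.0, 5):
    m, d = sol.sol(ti)
    print(f"t={ti:4.1f}  monomer={m:.3f}  dimer={d:.3f}")
```

Fitting the rate constants of such a model to measured time courses is precisely where the identifiability and model-comparison problems named at the end of the abstract arise.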

Relevance: 100.00%

Abstract:

This dissertation examines knowledge and industrial knowledge creation processes. It looks at the way knowledge is created in industrial processes from data, which is transformed into information and finally into knowledge. In the context of this dissertation, the main tools for industrial knowledge creation are different statistical methods. This dissertation strives to define industrial statistics. This is done using an expert opinion survey, which was sent to a number of industrial statisticians. The survey was conducted to create a definition for this field of applied statistics and to demonstrate the wide applicability of statistical methods to industrial problems. In this part of the dissertation, traditional methods of industrial statistics are introduced. As industrial statistics are the main tool for knowledge creation, the basics of statistical decision making and statistical modeling are also included. The widely known Data-Information-Knowledge-Wisdom (DIKW) hierarchy serves as the theoretical background for this dissertation. The way that data is transformed into information, information into knowledge and knowledge finally into wisdom is used as a theoretical frame of reference. Some scholars have, however, criticized the DIKW model. Based on these different perceptions of the knowledge creation process, a new knowledge creation process based on statistical methods is proposed. In the context of this dissertation, data is the source of knowledge in industrial processes. Because of this, the mathematical categorization of data into continuous and discrete types is explained. Different methods for gathering data from processes are clarified as well; this dissertation considers two: survey methods and measurements. The enclosed publications provide examples of the wide applicability of statistical methods in industry. In these publications, data is gathered using surveys and measurements. The publications have been chosen so that each employs different statistical methods in the analysis of data. There are some similarities between the analysis methods used in the publications, but for the most part different methods are used. Based on this dissertation, the use of statistical methods for industrial knowledge creation is strongly recommended. With statistical methods it is possible to handle large datasets, and the results of different types of statistical analysis can easily be transformed into knowledge.
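
A small sketch of the data-to-information-to-knowledge chain described above, in control-chart style: raw measurements (data) are summarized into statistics (information), which support a decision about the process (knowledge). The measurements and limits below are invented for illustration.

```python
# Data -> information -> knowledge, illustrated with a 3-sigma control check:
# raw measurements (data) become summary statistics (information), which
# support a process decision (knowledge). The measurements are invented.
from statistics import mean, stdev

measurements = [10.1, 9.8, 10.3, 10.0, 9.9, 10.2, 10.4, 9.7, 10.1, 10.0]

m, s = mean(measurements), stdev(measurements)    # information
ucl, lcl = m + 3 * s, m - 3 * s                   # 3-sigma control limits

new_observation = 10.9
in_control = lcl <= new_observation <= ucl        # knowledge: a decision
print(f"mean={m:.2f}, sd={s:.2f}, limits=({lcl:.2f}, {ucl:.2f})")
print("process in control" if in_control else "investigate special cause")
```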