948 results for Multicast Application Level


Relevance: 30.00%

Abstract:

As the development of integrated circuit technology continues to follow Moore's law, the complexity of circuits increases exponentially. Traditional hardware description languages such as VHDL and Verilog are no longer powerful enough to cope with this level of complexity and do not provide facilities for hardware/software codesign. Languages such as SystemC are intended to solve these problems by combining the expressive power of high-level programming languages with the hardware-oriented facilities of hardware description languages. To fully replace older languages in the design flow of digital systems, SystemC should also be synthesizable. The devices required by modern high-speed networks often share the same tight constraints on size, power consumption and price as embedded systems, but also have very demanding real-time and quality-of-service requirements that are difficult to satisfy with general-purpose processors. Dedicated hardware blocks of an application-specific instruction set processor are one way to combine fast processing speed, energy efficiency, flexibility and relatively low time-to-market. Common features can be identified in the network processing domain, making it possible to develop specialized but configurable processor architectures. One such architecture is TACO, which is based on the transport triggered architecture (TTA). The architecture offers a high degree of parallelism and modularity, and greatly simplified instruction decoding. For this M.Sc. (Tech.) thesis, a simulation environment for the TACO architecture was developed with SystemC 2.2, using an older version written in SystemC 1.0 as a starting point. The environment enables rapid design space exploration by providing facilities for hardware/software codesign and simulation, and an extendable library of automatically configured reusable hardware blocks. Other topics covered are the differences between SystemC 1.0 and 2.2 from the viewpoint of hardware modeling, and the compilation of a SystemC model into synthesizable VHDL with the Celoxica Agility SystemC Compiler. As a test case for the environment, a simulation model of a processor for TCP/IP packet validation was designed and tested.
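The transport-triggered principle behind TACO is easy to show in miniature: the only instruction a TTA processor executes is a data move between function-unit ports, and writing to a trigger port fires the unit's operation. The toy Python interpreter below is a sketch of this idea only; the unit names and two-unit configuration are illustrative and do not reflect TACO's actual module library.

# Toy transport-triggered architecture (TTA) interpreter: programs are
# just moves between function-unit ports; writing a trigger port fires
# the unit's operation. Unit and port names are illustrative.
class FunctionUnit:
    def __init__(self, operation):
        self.operand = 0           # plain operand port
        self.result = 0            # result port
        self.operation = operation

    def trigger(self, value):      # writing here starts the operation
        self.result = self.operation(self.operand, value)

def run(program, units):
    """Each instruction is a move: (src_unit, src_port) -> (dst_unit, dst_port)."""
    for (su, sp), (du, dp) in program:
        value = getattr(units[su], sp)
        if dp == "trigger":
            units[du].trigger(value)
        else:
            setattr(units[du], dp, value)

units = {
    "imm": FunctionUnit(lambda a, b: 0),   # used only as a register here
    "alu": FunctionUnit(lambda a, b: a + b),
}
units["imm"].result = 40
units["imm"].operand = 2
program = [
    (("imm", "result"), ("alu", "operand")),   # move 40 to the ALU operand
    (("imm", "operand"), ("alu", "trigger")),  # moving 2 triggers the add
]
run(program, units)
print(units["alu"].result)  # 42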

Relevance: 30.00%

Abstract:

Regulation has in many cases been delegated to independent agencies, which raises the question of how the democratic accountability of these agencies is ensured. There are few empirical approaches to agency accountability. We offer such an approach, resting upon three propositions. First, we scrutinize agency accountability both de jure (accountability is ensured by the formal rights of accountability 'fora' to receive information and impose consequences) and de facto (the capability of fora to use these rights depends on resources and decision costs that affect the credibility of their sanctioning capacity). Second, accountability must be evaluated separately at the political, operational and managerial levels. And third, at each level accountability is enacted by a system of several (partially) interdependent fora, which together form an accountability regime. The proposed framework is applied to the case of the German Bundesnetzagentur's accountability regime, which demonstrates its suitability for empirical purposes. Regulatory agencies are often considered independent, yet accountable. This article provides a realistic framework for the study of the accountability 'regimes' in which they are embedded. It emphasizes the need to identify the various actors (accountability fora) to which agencies are formally accountable (parliamentary committees, auditing bodies, courts, and so on) and to consider possible relationships between them. It argues that formal accountability 'on paper', as defined in official documents, does not fully account for de facto accountability, which depends on the resources possessed by the fora (mainly information-processing and decision-making capacities) and the credibility of their sanctioning capacities. The article applies this framework to the German Bundesnetzagentur.

Relevance: 30.00%

Abstract:

AIMS: To investigate and quantify the clinical benefits of early versus delayed application of Thomas splints in patients with isolated femur shaft fractures. MATERIALS AND METHODS: Level IV retrospective clinical and radiological analysis of patients presenting from January to December 2012 at a Level 1 Trauma Unit. All skeletally mature patients with isolated femur shaft fractures were included, irrespective of the mechanism of injury. Exclusion criteria were: ipsilateral fracture of the lower limb, femoral neck and supracondylar femur fractures, and periprosthetic and incomplete fractures. Clinical records were analysed for blood transfusion requirements, pulmonary complications, surgery time, duration of hospital stay and analgesic requirements. RESULTS: A total of 106 patients met the inclusion criteria: 74 males and 32 females. Fifty-seven patients (54%) were in the 'early splinted' group and 49 patients (46%) in the 'delayed splinted' group (P>0.05). The need for blood transfusion was significantly reduced in the 'early splinted' group (P=0.04). There was a significantly higher rate of pulmonary complications in the 'delayed splinted' group (P=0.008). All other parameters were similar between the two groups. CONCLUSION: The early application of Thomas splints for isolated femur fractures in non-polytraumatised patients has a clinically and statistically significant benefit in reducing the need for blood transfusions and the incidence of pulmonary complications.

Relevance: 30.00%

Abstract:

This thesis deals with distance transforms, which are a fundamental tool in image processing and computer vision. Two new distance transforms for gray-level images are presented, and as a new application, they are applied to gray-level image compression. Both new distance transforms extend the well-known distance transform algorithm developed by Rosenfeld, Pfaltz and Lay. With some modification, their algorithm, which calculates a distance transform on binary images with a chosen kernel, has been made to calculate a chessboard-like distance transform with integer values (the DTOCS) and a real-valued distance transform (the EDTOCS) on gray-level images. Both the DTOCS and the EDTOCS require only two passes over the gray-level image and are extremely simple to implement. Only two image buffers are needed: the original gray-level image and the binary image that defines the region(s) of calculation. No other image buffers are needed even if more than one iteration round is performed; for large neighborhoods and complicated images the two-pass algorithm has to be applied to the image more than once, typically 3–10 times. Different types of kernels can be adopted. It is important to notice that no other existing transform calculates the same kind of distance map as the DTOCS. All other gray-weighted distance function algorithms (GRAYMAT, etc.) find the minimum path joining two points by the smallest sum of gray levels, or weight the distance values directly by the gray levels in some manner. The DTOCS does not weight them that way: it gives a weighted version of the chessboard distance map in which the weights are not constant but are the gray-value differences of the original image. The difference between the DTOCS map and other distance transforms for gray-level images is shown. The difference between the DTOCS and the EDTOCS is that the EDTOCS calculates these gray-level differences in a different way: it propagates local Euclidean distances inside a kernel. Analytical derivations of some results concerning the DTOCS and the EDTOCS are presented. Distance transforms are commonly used for feature extraction in pattern recognition and learning, but their use in image compression is very rare. This thesis introduces a new application area for distance transforms. Three new image compression algorithms based on the DTOCS and one based on the EDTOCS are presented. Control points, i.e. points that are considered fundamental for the reconstruction of the image, are selected from the gray-level image using the DTOCS and the EDTOCS. The first group of methods selects the maxima of the distance image as new control points, and the second group compares the DTOCS distance to the binary-image chessboard distance. The effect of applying threshold masks of different sizes along the threshold boundaries is studied. The time complexity of the compression algorithms is analyzed both analytically and experimentally. It is shown that the time complexity of the algorithms is independent of the number of control points, i.e. of the compression ratio. A new morphological image decompression scheme, the 8 kernels' method, is also presented. Several decompressed images are shown. The best results are obtained using the Delaunay triangulation. The obtained image quality equals that of the DCT images with a 4 x 4
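A minimal sketch of the two-pass recurrence described above, assuming a 3x3 kernel: the distance grows by the gray-value difference to the neighbour plus one (the chessboard step), and the forward/backward pass pair may be iterated several times. The function name and the region convention (0 = source pixels, 1 = calculation region) are illustrative assumptions, not the thesis's exact interface.

import numpy as np

def dtocs(gray, binary, iterations=3):
    """Two-pass DTOCS sketch: the distance via a neighbour is that
    neighbour's distance plus the absolute gray-level difference plus 1.
    `binary`: 0 = source pixels (distance 0), 1 = region of calculation."""
    d = np.where(binary == 0, 0.0, np.inf)
    rows, cols = gray.shape
    fwd = [(-1, -1), (-1, 0), (-1, 1), (0, -1)]   # causal neighbours
    bwd = [(1, 1), (1, 0), (1, -1), (0, 1)]       # anti-causal neighbours
    for _ in range(iterations):
        for offs, ys in ((fwd, range(rows)), (bwd, range(rows - 1, -1, -1))):
            for y in ys:
                xs = range(cols) if offs is fwd else range(cols - 1, -1, -1)
                for x in xs:
                    for dy, dx in offs:
                        ny, nx = y + dy, x + dx
                        if 0 <= ny < rows and 0 <= nx < cols:
                            cand = d[ny, nx] + abs(float(gray[y, x]) - float(gray[ny, nx])) + 1
                            if cand < d[y, x]:
                                d[y, x] = cand
    return d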

Relevance: 30.00%

Abstract:

Case-crossover (CCO) is one of the most widely used designs for analyzing the health-related effects of air pollution; nevertheless, its application and methodology in this context had not previously been reviewed. Objective: We conducted a systematic review of case-crossover designs used to study the relationship between air pollution and morbidity and mortality, from the standpoint of both methodology and application. Data sources and extraction: A search was made of the MEDLINE and EMBASE databases. Reports were classified as methodological or applied. From the latter, the following information was extracted: author, study location, year, type of population (general or patients), dependent variable(s), independent variable(s), type of CCO design, and whether effect modification was analyzed for variables at the individual level. Data synthesis: The review covered 105 reports that fulfilled the inclusion criteria. Of these, 24 addressed methodological aspects and the remainder involved the design's application. In the methodological reports, the designs that yielded the best results in simulation were the symmetric bidirectional CCO and the time-stratified CCO, and we observed an increase over time in the use of these designs. The dependent variables most frequently analyzed were those relating to hospital morbidity; the pollutants most often studied were those linked to particulate matter. Among the CCO-application reports, 13.6% studied effect modification for variables at the individual level. Conclusions: The use of CCO designs has undergone considerable growth; the most widely used designs were those that yielded better results in simulation studies: the symmetric bidirectional and time-stratified CCO. However, the advantages of CCO as a method for analyzing variables at the individual level are put to little use.
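For readers unfamiliar with the time-stratified variant: referents are typically the other days of the same calendar month that fall on the event day's weekday, which controls for long-term trend, seasonality and day-of-week effects. A sketch of that selection (function name illustrative):

from datetime import date, timedelta

def time_stratified_referents(event_day: date):
    """Referent (control) days for a time-stratified case-crossover
    design: all days in the same calendar month and year that fall on
    the same day of the week as the event, excluding the event day."""
    d = date(event_day.year, event_day.month, 1)
    referents = []
    while d.month == event_day.month:
        if d.weekday() == event_day.weekday() and d != event_day:
            referents.append(d)
        d += timedelta(days=1)
    return referents

# e.g. an admission on 2012-06-15 (a Friday) is compared with the
# other Fridays of June 2012: 1, 8, 22 and 29 June.
print(time_stratified_referents(date(2012, 6, 15)))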

Relevance: 30.00%

Abstract:

In general, models of ecological systems can be broadly categorized as 'top-down' or 'bottom-up', based on the hierarchical level at which the model processes are formulated. The structure of a top-down, also known as phenomenological, population model can be interpreted in terms of population characteristics, but it typically lacks an interpretation on a more basic level. In contrast, bottom-up, also known as mechanistic, population models are derived from assumptions and processes on a more basic level, which allows the model parameters to be interpreted in terms of individual behavior. Both approaches, phenomenological and mechanistic modelling, have their advantages and disadvantages in different situations. However, mechanistically derived models may be better at capturing the properties of the system at hand, and thus give more accurate predictions. In particular, when models are used for evolutionary studies, mechanistic models are more appropriate, since natural selection takes place at the individual level, and in mechanistic models the direct connection between model parameters and individual properties has already been established. The purpose of this thesis is twofold. Firstly, a systematic way to derive mechanistic discrete-time population models is presented. The derivation is based on combining explicitly modelled, continuous processes on the individual level within a reproductive period with a discrete-time maturation process between reproductive periods. Secondly, as an example of how evolutionary studies can be carried out in mechanistic models, the evolution of the timing of reproduction is investigated. These two lines of research, the derivation of mechanistic population models and evolutionary studies, are thus complementary to each other.
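A minimal sketch of the derivation idea: explicit continuous processes within a reproductive period (here an assumed mortality model dN/dt = -mu*N - c*N**2, integrated with a simple Euler scheme) combined with discrete reproduction between periods yield a mechanistic discrete-time map whose parameters keep their individual-level meaning. The mortality model and parameter values are illustrative assumptions, not the thesis's model.

def within_season(n0, mu, c, t_season=1.0, dt=1e-3):
    """Continuous within-season dynamics: density-independent mortality
    (mu) plus crowding mortality (c*N), i.e. dN/dt = -mu*N - c*N**2,
    integrated over one reproductive period with Euler steps."""
    n = n0
    for _ in range(round(t_season / dt)):
        n += dt * (-mu * n - c * n * n)
    return n

def next_generation(n, fecundity=10.0, mu=0.5, c=0.01):
    """Discrete-time map between reproductive periods: offspring are
    produced at the start of the season and then survive the continuous
    within-season processes."""
    return within_season(fecundity * n, mu, c)

# iterating the mechanistically derived map
n = 1.0
for year in range(5):
    n = next_generation(n)
    print(year, round(n, 3))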

Relevance: 30.00%

Abstract:

An optode based on thymol blue (TB), an acid–base indicator, has been constructed and evaluated as a detector in a flow injection analysis (FIA) system for CO2 determination. The dye was chemically immobilised on the surface of a bifurcated glass optical fibre bundle by silanisation in organic media. In the FIA system, hydrogen carbonate or carbonate samples are injected into a buffer carrier solution and then mixed with phosphoric acid solution to generate CO2, which diffuses through a PTFE membrane and is collected in an acceptor carrier fluid pumped towards the detection cell, into which the optode is fitted. The proposed system presents two linear response ranges, from 1.0 x 10-3 to 1.0 x 10-2 mol l-1 and from 2.0 x 10-2 to 0.10 mol l-1. The sampling frequency was 11 samples h-1, with good repeatability (R.S.D. < 4%, n = 10). Under flow conditions the optode lifetime was 170 h. The system was applied to the analysis of commercial mineral water, and the results obtained for hydrogen carbonate determination did not differ significantly from those obtained by potentiometry at a confidence level of 95%.
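In practice, a detector with two linear response ranges is used by calibrating each segment separately and inverting whichever segment the measured signal falls in. The sketch below illustrates this; the calibration signals are entirely hypothetical, and only the concentration limits of the two ranges come from the text.

import numpy as np

# Hypothetical calibration points (concentration in mol/L, signal) for
# the two linear ranges quoted above; the signal values are invented.
low = np.array([[1.0e-3, 0.021], [5.0e-3, 0.102], [1.0e-2, 0.205]])
high = np.array([[2.0e-2, 0.262], [5.0e-2, 0.410], [1.0e-1, 0.655]])

def fit(points):
    """Least-squares line through one calibration segment."""
    slope, intercept = np.polyfit(points[:, 0], points[:, 1], 1)
    return slope, intercept

def concentration(signal):
    """Invert whichever linear segment the signal falls in."""
    for points in (low, high):
        m, b = fit(points)
        c = (signal - b) / m
        if points[0, 0] <= c <= points[-1, 0]:
            return c
    return None  # outside both calibrated ranges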

Relevance: 30.00%

Abstract:

A field experiment with the irrigated rice cultivar BRS Formoso, conducted to assess the efficiency of calcined serpentinite as a silicon source for grain yield, was used to study its effect on leaf blast severity and tissue sugar levels. The treatments consisted of five rates of calcined serpentinite (0, 2, 4, 6 and 8 Mg ha-1) incorporated into the soil prior to planting. Leaf blast severity was reduced at the rate of 2.96% per ton of calcined serpentinite. The total tissue sugar content decreased significantly as the applied serpentinite rate increased (R² = 0.83). The relationship between tissue sugar content and leaf blast severity was linear and positive (R² = 0.81). The decrease in leaf blast severity with increasing rates of calcined serpentinite was also linear (R² = 0.96) and can be ascribed to the reduced sugar level.

Relevance: 30.00%

Abstract:

The utilization of organic wastes represents an alternative for recovering degraded pasture. The experiment aimed to assess the changes caused by the application of different organic wastes (poultry litter, turkey litter and pig manure) to a medium-textured Oxisol under degraded pasture in the Brazilian Savanna. Different doses of each waste were applied, compared with mineral and organomineral fertilizers, and the effects were evaluated on soil parameters (pH, organic matter, phosphorus and potassium) and on Brachiaria decumbens leaves (crude protein, phosphorus and dry mass production). The application of organic waste did not increase soil organic matter or pH in the surface layer, and the application of turkey litter caused acidification at depths of 0.20-0.40 m and 0.40-0.60 m. There was an increase in soil P and K with the application of poultry litter and swine manure. All the organic wastes increased dry matter productivity and crude protein and phosphorus contents. The recycling of nutrients via the application of organic waste achieves, for most parameters, an efficiency similar to that observed with mineral sources, contributing to improving the nutritional status of the soil-plant system.

Relevance: 30.00%

Abstract:

The objective of this study was to evaluate the effects of the application of different water depths and nitrogen and potassium doses on the quality of Tanzania grass in the south of the state of Tocantins. The experiment was conducted in strips under conventional sprinklers, and the treatments were combinations of N and K2O fertilizer doses, always in the ratio of 1 N:0.8 K2O. Throughout the experiment, plant height (PH), crude protein (CP) and neutral detergent fiber (NDF) were determined. The highest plant height obtained was 132.4 cm, with a fertilizer dose of 691.71 kg ha-1 in the proportion of 1 N:0.8 K2O, that is, 384.28 kg ha-1 of N and 307.43 kg ha-1 of K2O, and a water depth of 80% of the ETc. The highest crude protein content was 12.2%, with a fertilizer dose of 700 kg ha-1 yr-1 in the proportion of 1 N:0.8 K2O, that is, 388.89 kg ha-1 of N and 311.11 kg ha-1 of K2O, and no irrigation. The lowest neutral detergent fiber level was 60.7%, with the smallest fertilizer dose and the highest water depth. It was concluded that plant height increased with increasing fertilizer dose and water depth. The crude protein content increased by 5.4% in the dry season with increasing fertilizer dose and water depth, and in the dry season the NDF content increased by 4.5% with increasing fertilizer application and water depth.
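The N and K2O figures quoted above follow from splitting the combined dose in the stated 1:0.8 ratio; a quick check:

def split_npk(total, ratio_k=0.8):
    """Split a combined N + K2O dose applied in the ratio 1 N : 0.8 K2O."""
    n = total / (1 + ratio_k)
    return n, n * ratio_k

# reproduces the figures quoted in the abstract
print(split_npk(691.71))  # -> (384.28..., 307.43...)
print(split_npk(700.0))   # -> (388.89..., 311.11...)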

Relevance: 30.00%

Abstract:

The aim of this study was to evaluate the possible impacts on the soil and on the percolate in drainage lysimeters of the application of different rates of swine wastewater (SW) during the soybean crop cycle, and to assess crop productivity. The experiment was conducted at the Agricultural Engineering Experimental Center of UNIOESTE, on a soil classified as a typical Distroferric Red Latosol. Twenty-four drainage lysimeters were installed in the area, in which soybean (cultivar CD 214) was grown. Four SW rates (0, 100, 200 and 300 m³ ha-1) were applied to the soil seven days before sowing, in a single application, combined with two mineral fertilization treatments (with and without the recommended fertilization at sowing), with three replicates per treatment. Three percolate collections were made in each experimental plot: the first at 40 days after sowing (DAS), the second at 72 DAS, and the third at the end of the crop cycle (117 DAS). The percolate was analysed for pH, calcium, magnesium, potassium, phosphorus and total nitrogen. Based on the results, the levels of K, P and N in the soil increased with increasing SW rates, and the levels of K and P in the percolate were higher at the higher SW rates. Productivity was not influenced by the application of SW or by fertilization.

Relevance: 30.00%

Abstract:

OBJECTIVE: To gather and clarify the actual effects of low-level laser therapy on wound healing and its most effective modes of application in human and veterinary medicine. METHODS: We searched for original articles published in journals between the years 2000 and 2011, in Spanish, English, French and Portuguese, in the following databases: Lilacs, Medline, PubMed and Bireme. Articles had to contain a methodological description of the experimental design and the parameters used. RESULTS: Doses ranging from 3 to 6 J/cm2 appear to be more effective, and doses above 10 J/cm2 are associated with deleterious effects. Wavelengths ranging from 632.8 to 1000 nm remain those that provide the most satisfactory results in the wound healing process. CONCLUSION: Low-level laser can be safely applied to accelerate the resolution of cutaneous wounds, although this is closely related to the choice of parameters such as dose, time of exposure and wavelength.

Relevance: 30.00%

Abstract:

Knowledge of the behaviour of cellulose, hemicelluloses, and lignin during wood and pulp processing is essential for understanding and controlling the processes. Determination of the monosaccharide composition gives information about the structural polysaccharide composition of wood material and helps in determining the quality of fibrous products. In addition, monitoring of the acidic degradation products gives information on the extent of degradation of lignin and polysaccharides. This work describes two capillary electrophoretic methods developed for the analysis of monosaccharides and for the determination of aliphatic carboxylic acids in alkaline oxidation solutions of lignin and wood. Capillary electrophoresis (CE), in its many variants, is an alternative separation technique to chromatographic methods. In capillary zone electrophoresis (CZE) the fused silica capillary is filled with an electrolyte solution, and an applied voltage generates a field across the capillary. The movement of the ions under the electric field depends on the charge and hydrodynamic radius of the ions. Carbohydrates contain hydroxyl groups that are ionised only under strongly alkaline conditions; after ionisation, the structures are suitable for electrophoretic analysis and identification through either indirect UV detection or electrochemical detection. The current work presents a new capillary zone electrophoretic method relying on an in-capillary reaction and direct UV detection at a wavelength of 270 nm. The method has been used for the simultaneous separation of neutral carbohydrates, including mono- and disaccharides and sugar alcohols. The in-capillary reaction produces negatively charged and UV-absorbing compounds. The optimised method was applied to real samples; the methodology is fast, since no sample preparation other than dilution is required. A new method for aliphatic carboxylic acids in highly alkaline process liquids was also developed. The goal was a method for the simultaneous analysis of the dicarboxylic acids, hydroxy acids and volatile acids that are oxidation and degradation products of lignin and wood polysaccharides. The CZE method was applied to three process cases. First, the fate of lignin under alkaline oxidation conditions was monitored by determining the level of carboxylic acids in process solutions. In the second application, the degradation of spruce wood by alkaline and catalysed alkaline oxidation was compared by determining carboxylic acids in the process solutions. In addition, the effectiveness of membrane filtration and preparative liquid chromatography for the enrichment of hydroxy acids from black liquor was evaluated by analysing the effluents with capillary electrophoresis.

Relevance: 30.00%

Abstract:

The aim of this study was to describe the demographic, clinicopathological, biological and morphometric features of Libyan breast cancer patients. The supporting value of nuclear morphometry and static image cytometry for the sensitivity of detecting breast cancer in conventional fine-needle aspiration biopsies was estimated, and the findings were compared with findings in breast cancer in Finland and Nigeria. In addition, the value of ER and PR was evaluated. The material comprised 131 histological samples, 41 cytological samples, and demographic and clinicopathological data from 234 Libyan patients. Libyan breast cancer is predominantly premenopausal; in this feature it is similar to breast cancer in sub-Saharan Africans, but clearly different from breast cancer in Europeans, which is predominantly postmenopausal in character. At presentation most Libyan patients have locally advanced disease, which is associated with poor survival rates. Nuclear morphometry and image DNA cytometry agree with earlier published data on the Finnish population and indicate that nuclear size and DNA analysis of nuclear content can be used to increase cytological sensitivity and specificity in doubtful breast lesions, particularly when the free cell sampling method is used. Combination of the morphometric data with earlier free cell data gave the following diagnostic guidelines: range of overlap in free cell samples, 55-71 μm²; cut-off values for diagnostic purposes, mean nuclear area (MNA) > 54 μm² for 100% detection of malignant cases (specificity 84%) and MNA < 72 μm² for 100% detection of benign cases (sensitivity 91%). Histomorphometry showed a significant correlation between the MNA and most clinicopathological features, with the strongest association observed for histological grade (p < 0.0001). MNA appears to be a prognosticator in Libyan breast cancer (Pearson's test r = -0.29, p = 0.019), but at a lower level of significance than in the European material; a corresponding relationship was not found for shape-related morphometric features. ER and PR staining scores correlated with clinical stage (p = 0.017 and p = 0.015, respectively) and were also associated with lymph-node-negative patients (p = 0.03 and p = 0.05, respectively). Hormone receptor positive (HR+) patients had better survival. The fraction of HR+ cases among Libyan breast cancers is about the same as the fraction of positive cases in European breast cancer. The study suggests that even weak staining (corresponding to as few as 1% positive cells) has prognostic value; this prognostic significance may be associated with the practice of using antihormonal therapy in HR+ cases. The low survival and advanced presentation are associated with active cell proliferation, atypical nuclear morphology and aneuploid nuclear DNA content in Libyan breast cancer patients. The findings support the idea that breast cancer is not one type of disease, but should probably be classified into premenopausal and postmenopausal types.
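The quoted cut-offs combine naturally into a three-way triage rule: values below the overlap range were seen only in benign cases, values above it only in malignant ones, and the 55-71 μm² overlap remains cytologically indeterminate. An illustrative sketch (not part of the study itself):

def classify_by_mna(mean_nuclear_area_um2: float) -> str:
    """Illustrative decision rule from the cut-offs quoted above (free
    cell FNA samples): all malignant cases had MNA > 54 um^2 and all
    benign cases had MNA < 72 um^2, so values outside the 55-71 um^2
    overlap range can be assigned, while values inside it cannot."""
    if mean_nuclear_area_um2 < 55:
        return "benign"
    if mean_nuclear_area_um2 > 71:
        return "suspicious for malignancy"
    return "indeterminate (overlap range)"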

Relevance: 30.00%

Abstract:

Technological developments in microprocessors and the ICT landscape have led to a new era in which computing power is embedded in numerous small distributed objects and devices in our everyday lives. These small computing devices are fine-tuned to perform particular tasks and are increasingly reaching our society at every level. For example, home appliances such as programmable washing machines and microwave ovens employ several sensors to improve performance and convenience; similarly, cars have on-board computers that use information from many different sensors to control things such as fuel injectors and spark plugs efficiently. These individual devices make life easier by helping to take decisions and removing burdens from their users. All these objects and devices obtain some piece of information about the physical environment, yet each of them is an island, with no proper connectivity or information sharing between them. Sharing information between these heterogeneous devices could enable a whole new universe of innovative and intelligent applications, but it is a difficult task because of the heterogeneity of the devices and their lack of interoperability. The Smart Space vision is to overcome these issues of heterogeneity and interoperability so that devices can understand each other and utilize each other's services by sharing information. This enables innovative local mashup applications based on data shared between heterogeneous devices. Smart homes are one example of Smart Spaces: through the intelligent interconnection of resources and their collective behavior, they make it possible to bring the health care system to the patient, as opposed to bringing the patient into the health system. In addition, the use of mobile handheld devices has risen at a tremendous rate during the last few years, and they have become an essential part of everyday life. Mobile phones offer a wide range of services to their users, including text and multimedia messages, Internet, audio, video and email applications, and most recently TV services; interactive TV provides a variety of applications for viewers. The combination of interactive TV and Smart Spaces could yield innovative applications that are personalized, context-aware, ubiquitous and intelligent, by enabling heterogeneous systems to collaborate with each other by sharing information. There are many challenges in designing the frameworks and application development tools for the rapid and easy development of such applications, and the research work presented in this thesis addresses these issues. The original publications presented in the second part of this thesis propose architectures and methodologies for interactive and context-aware applications, and tools for their development. We demonstrate the suitability of our ontology-driven application development tools and rule-based approach for the development of dynamic, context-aware, ubiquitous iTV applications.
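A minimal sketch of the Smart Space idea described above: devices share RDF-style (subject, predicate, object) triples through a common broker and subscribe to the patterns they care about. The class and method names below are illustrative assumptions, not the actual Smart-M3 or thesis API.

from typing import Callable

Triple = tuple[str, str, str]

class SmartSpace:
    """Toy information broker: a shared triple store with
    pattern-based subscriptions ('?' matches any value)."""
    def __init__(self):
        self.triples: set[Triple] = set()
        self.subscriptions: list[tuple[Triple, Callable[[Triple], None]]] = []

    def insert(self, triple: Triple) -> None:
        """A device publishes a piece of information."""
        self.triples.add(triple)
        for pattern, callback in self.subscriptions:
            if self._matches(pattern, triple):
                callback(triple)

    def subscribe(self, pattern: Triple, callback) -> None:
        """Another device reacts when matching information appears."""
        self.subscriptions.append((pattern, callback))

    @staticmethod
    def _matches(pattern: Triple, triple: Triple) -> bool:
        return all(p in ("?", t) for p, t in zip(pattern, triple))

space = SmartSpace()
# the iTV subscribes to a patient's heart-rate readings
space.subscribe(("patient1", "heartRate", "?"),
                lambda t: print("show on iTV:", t))
# a sensor device publishes a reading; the iTV callback fires
space.insert(("patient1", "heartRate", "72"))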