988 results for Segmented thermoplastic


Relevance:

10.00%

Publisher:

Abstract:

Recent advances in airborne Light Detection and Ranging (LIDAR) technology allow rapid and inexpensive measurements of topography over large areas. Airborne LIDAR systems usually return a 3-dimensional cloud of point measurements from reflective objects scanned by the laser beneath the flight path. This technology is becoming a primary method for extracting information about different kinds of geometrical objects, such as high-resolution digital terrain models (DTMs), buildings, and trees. In the past decade, LIDAR has attracted growing interest from researchers in the fields of remote sensing and GIS. Compared to traditional data sources, such as aerial photography and satellite images, LIDAR measurements are not influenced by sun shadow and relief displacement. However, the voluminous data pose a new challenge for the automated extraction of geometrical information from LIDAR measurements, because many raster image processing techniques cannot be directly applied to irregularly spaced LIDAR points. In this dissertation, a framework is proposed to automatically extract information about different kinds of geometrical objects, such as terrain and buildings, from LIDAR data. These objects are essential to numerous applications such as flood modeling, landslide prediction, and hurricane animation. The framework consists of several intuitive algorithms. First, a progressive morphological filter was developed to detect non-ground LIDAR measurements. By gradually increasing the window size and elevation difference threshold of the filter, the measurements of vehicles, vegetation, and buildings are removed, while ground data are preserved. Then, building measurements are identified from the non-ground measurements using a region growing algorithm based on a plane-fitting technique. Raw footprints for segmented building measurements are derived by connecting boundary points and are further simplified and adjusted by several proposed operations to remove the noise caused by irregularly spaced LIDAR measurements. To reconstruct 3D building models, the raw 2D topology of each building is first extracted and then further adjusted. Since the adjusting operations for simple building models do not work well on 2D topology, a 2D snake algorithm is proposed to adjust it. The 2D snake algorithm consists of newly defined energy functions for topology adjustment and a linear algorithm to find the minimal energy value of 2D snake problems. Data sets from urbanized areas including large institutional, commercial, and small residential buildings were employed to test the proposed framework. The results demonstrated that the proposed framework achieves very good performance.
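
The progressive morphological filter lends itself to a compact illustration. Below is a hedged sketch of the core idea only, not the dissertation's implementation: grey-scale opening of a gridded minimum-elevation surface with progressively larger windows and elevation-difference thresholds. The window sizes, thresholds, and function names are illustrative assumptions.

```python
# Hedged sketch of a progressive morphological filter on a gridded
# minimum-elevation surface (the dissertation works on irregular points;
# window sizes and thresholds here are illustrative assumptions).
import numpy as np
from scipy import ndimage

def progressive_morphological_filter(z, windows=(3, 9, 21),
                                     thresholds=(0.5, 1.0, 2.5)):
    """Boolean ground mask for elevation grid z: cells rising above a
    grey-scale opening by more than the threshold are flagged non-ground."""
    ground = np.ones(z.shape, dtype=bool)
    surface = z.astype(float).copy()
    for w, dh in zip(windows, thresholds):
        opened = ndimage.grey_opening(surface, size=(w, w))
        ground &= (surface - opened) <= dh   # keep cells close to the opening
        surface = opened                     # progressively flattened surface
    return ground

# Small flat tile with a 'building' block: the block is removed from ground.
z = np.zeros((40, 40)); z[10:20, 10:20] = 8.0
print(progressive_morphological_filter(z)[12, 12])  # False -> non-ground
```

Small windows remove vehicles and vegetation first; the building plateau survives until the window exceeds its footprint, at which point the larger elevation threshold separates it from genuine terrain relief.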

Relevance:

10.00%

Publisher:

Abstract:

Exchange rate economics has achieved substantial development in the past few decades. Despite extensive research, a large number of unresolved problems remain in the exchange rate debate. This dissertation studied three puzzling issues with the aim of improving our understanding of exchange rate behavior. Chapter Two used advanced econometric techniques to model and forecast exchange rate dynamics. Chapters Three and Four studied issues related to exchange rates using the theory of New Open Economy Macroeconomics. Chapter Two empirically examined the short-run forecastability of nominal exchange rates. It analyzed important empirical regularities in daily exchange rates. Through a series of hypothesis tests, a best-fitting fractionally integrated GARCH model with a skewed Student-t error distribution was identified. The forecasting performance of this model was compared with that of a random walk model. The results supported the contention that nominal exchange rates seem to be unpredictable over the short run, in the sense that the best-fitting model cannot beat the random walk model in forecasting exchange rate movements. Chapter Three assessed the ability of dynamic general-equilibrium sticky-price monetary models to generate volatile foreign exchange risk premia. It developed a tractable two-country model in which agents face a cash-in-advance constraint and set prices to the local market, and the exogenous money supply process exhibits time-varying volatility. The model yielded approximate closed-form solutions for risk premia and real exchange rates. Numerical results provided quantitative evidence that volatile risk premia can endogenously arise in a new open economy macroeconomic model. Thus, the model has the potential to rationalize the Uncovered Interest Parity Puzzle. Chapter Four sought to resolve the consumption-real exchange rate anomaly, which refers to the inability of most international macro models to generate the negative cross-correlations between real exchange rates and relative consumption across two countries that are observed in the data. While maintaining the assumption of complete asset markets, this chapter introduced endogenously segmented asset markets into a dynamic sticky-price monetary model. Simulation results showed that such a model can replicate the stylized fact that real exchange rates tend to move in the opposite direction from relative consumption.
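
As a rough illustration of the Chapter Two methodology, the sketch below fits a fractionally integrated GARCH (FIGARCH) model with skewed Student-t errors using the Python `arch` package and compares it against a zero-return random-walk benchmark. The data are synthetic placeholders, and the in-sample comparison is deliberately simpler than the dissertation's forecasting tests.

```python
# Hedged sketch (not the dissertation's code): fit a fractionally integrated
# GARCH model with skewed Student-t errors using the `arch` package, then
# compare against a zero-return random-walk benchmark. Data are synthetic.
import numpy as np
import pandas as pd
from arch import arch_model

rng = np.random.default_rng(0)
rates = pd.Series(1.20 * np.exp(np.cumsum(rng.normal(0, 0.005, 2000))))
returns = 100 * np.log(rates).diff().dropna()  # daily log-returns in percent

model = arch_model(returns, mean="Constant", vol="FIGARCH", p=1, q=1, dist="skewt")
res = model.fit(disp="off")
print(res.summary())

# Random walk predicts a zero return; the fitted model predicts its mean `mu`.
rw_rmse = np.sqrt(np.mean(returns.values ** 2))
model_rmse = np.sqrt(np.mean((returns.values - res.params["mu"]) ** 2))
print(f"RMSE random walk: {rw_rmse:.4f}  RMSE model mean: {model_rmse:.4f}")
```

On real exchange-rate data the two RMSEs are typically indistinguishable, which is exactly the short-run unpredictability result the chapter reports.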

Relevance:

10.00%

Publisher:

Abstract:

This dissertation introduces a new system for handwritten text recognition based on an improved neural network design. Most existing neural networks treat the mean square error function as the standard error function. The system proposed in this dissertation instead utilizes the mean quartic error function, whose third and fourth derivatives are non-zero. Consequently, many improvements in the training methods were achieved. The training results are carefully assessed before and after each update. To evaluate the performance of a training system, three essential factors must be considered, in order from high to low priority: (1) the error rate on the testing set, (2) the processing time needed to recognize a segmented character, and (3) the total training time and, subsequently, the total testing time. It is observed that bounded training methods accelerate the training process, while semi-third order training methods, next-minimal training methods, and preprocessing operations reduce the error rate on the testing set. Empirical observations suggest that two different combinations of training methods are needed for lower-case and upper-case character recognition. Since character segmentation is required for word and sentence recognition, this dissertation also provides an effective rule-based segmentation method, which differs from conventional adaptive segmentation methods. Dictionary-based correction is utilized to correct mistakes resulting from the recognition and segmentation phases. The integration of the segmentation methods with the handwritten character recognition algorithm yielded an accuracy of 92% for lower-case characters and 97% for upper-case characters. In the testing phase, the database consisted of 20,000 handwritten characters, 10,000 for each case. Recognizing the 10,000 handwritten characters of one case required 8.5 seconds of processing time.
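
The loss function at the heart of the proposal is simple enough to state directly. Here is a minimal sketch of the mean quartic error and its gradient as they would plug into backpropagation; the array shapes and example values are illustrative, not from the dissertation.

```python
# Minimal sketch of the mean quartic error the dissertation substitutes for
# mean squared error, with its gradient for backpropagation.
import numpy as np

def mean_quartic_error(y_pred, y_true):
    """L = mean((y_pred - y_true)^4); its 3rd and 4th derivatives are non-zero."""
    return np.mean((y_pred - y_true) ** 4)

def mqe_gradient(y_pred, y_true):
    """dL/dy_pred = 4 * (y_pred - y_true)^3 / N, replacing the MSE gradient."""
    e = y_pred - y_true
    return 4 * e ** 3 / e.size

y_pred, y_true = np.array([0.9, 0.2, 0.4]), np.array([1.0, 0.0, 0.0])
print(mean_quartic_error(y_pred, y_true), mqe_gradient(y_pred, y_true))
```

Because the gradient grows cubically rather than linearly in the error, large residuals are penalized far more aggressively than under MSE, which is one plausible source of the training improvements described above.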

Relevance:

10.00%

Publisher:

Abstract:

This phenomenological study explored Black male law enforcement officers' perspectives on how racial profiling shaped their decisions to explore and commit to a law enforcement career. Criterion and snowball sampling were used to obtain the 17 participants for this study. Super's (1990) archway model was used as the theoretical framework. The archway model "is designed to bring out the segmented but unified and developmental nature of career development, to highlight the segments, and to make their origin clear" (Super, 1990, p. 201). Interview data were analyzed using inductive, deductive, and comparative analyses. Three themes emerged from the inductive analysis of the data: (a) color and/or race does matter, (b) putting on the badge, and (c) too black to be blue and too blue to be black. The deductive analysis used a priori coding based on Super's (1990) archway model and revealed that the participants' career exploration was influenced by their knowledge of racial profiling and of how others view them. The comparative analysis between the inductive themes and deductive findings found that the theme "color and/or race does matter" was present in the relationships between and within all segments of Super's (1990) model. The comparative analysis also revealed an expanded notion of self-concept for Black males as marginalized and/or oppressed individuals. Self-concepts "such as self-efficacy, self-esteem, and role self-concepts, being combinations of traits ascribed to oneself" (Super, 1990, p. 202) do not completely address the self-concept of marginalized and/or oppressed individuals, which consists of self-efficacy, self-esteem, and traits ascribed to oneself, expanded by their awareness of how others view them (DuBois, 1995; Freire, 1970; Sheared, 1990; Super, 1990; Young, 1990). Ultimately, self-concept is used to make career and life decisions. Current human resource policies and practices do not take into consideration that negative police contact could be the result of racial profiling, and current hiring guidelines penalize individuals who have had negative police contact. Racial profiling is therefore a discriminatory act that can effectively circumvent U.S. Equal Employment Opportunity Commission laws and serve as a boundary mechanism to employment (Rocco & Gallagher, 2004).

Relevance:

10.00%

Publisher:

Abstract:

Moving objects database systems are the most challenging sub-category among Spatio-Temporal database systems. A database system that updates in real-time the location information of GPS-equipped moving vehicles has to meet even stricter requirements. Currently existing data storage models and indexing mechanisms work well only when the number of moving objects in the system is relatively small. This dissertation research aimed at the real-time tracking and history retrieval of massive numbers of vehicles moving on road networks. A total solution has been provided for the real-time update of the vehicles’ location and motion information, range queries on current and history data, and prediction of vehicles’ movement in the near future. To achieve these goals, a new approach called Segmented Time Associated to Partitioned Space (STAPS) was first proposed in this dissertation for building and manipulating the indexing structures for moving objects databases. Applying the STAPS approach, an indexing structure of associating a time interval tree to each road segment was developed for real-time database systems of vehicles moving on road networks. The indexing structure uses affordable storage to support real-time data updates and efficient query processing. The data update and query processing performance it provides is consistent without restrictions such as a time window or assuming linear moving trajectories. An application system design based on distributed system architecture with centralized organization was developed to maximally support the proposed data and indexing structures. The suggested system architecture is highly scalable and flexible. Finally, based on a real-world application model of vehicles moving in region-wide, main issues on the implementation of such a system were addressed.
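
To make the "time interval index per road segment" idea concrete, here is a toy stand-in, not the dissertation's actual STAPS structures: a per-segment list of presence intervals answering "which vehicles were on segment s during [t1, t2]?". The class and method names are invented, and a linear scan replaces the interval tree for brevity.

```python
# Toy stand-in for the STAPS idea (not the dissertation's structures):
# per-road-segment time-interval indexes for historical range queries.
from collections import defaultdict

class SegmentTimeIndex:
    def __init__(self):
        # road segment id -> list of (t_enter, t_exit, vehicle_id)
        self._index = defaultdict(list)

    def record(self, segment_id, t_enter, t_exit, vehicle_id):
        self._index[segment_id].append((t_enter, t_exit, vehicle_id))

    def query(self, segment_id, t1, t2):
        """Vehicles whose presence interval overlaps [t1, t2]; a real system
        would use an interval tree per segment instead of a linear scan."""
        return {vid for (a, b, vid) in self._index[segment_id]
                if a <= t2 and b >= t1}

idx = SegmentTimeIndex()
idx.record("seg-17", 100, 180, "car-1")
idx.record("seg-17", 150, 300, "car-2")
print(idx.query("seg-17", 160, 170))  # {'car-1', 'car-2'} (order may vary)
```

Partitioning the index by road segment keeps each per-segment structure small, which is what makes consistent update and query performance plausible at scale.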

Relevance:

10.00%

Publisher:

Abstract:

Diazotrophic (N2-fixing) cyanobacteria provide the biological source of new nitrogen for large parts of the ocean. However, little is known about their sensitivity to global change. Here we show that the single most important nitrogen fixer in today's ocean, Trichodesmium, is strongly affected by changes in CO2 concentrations. The cell division rate doubled with rising CO2 (from glacial to projected year-2100 levels), prompting lower cellular carbon, nitrogen, and phosphorus contents and reduced cell dimensions. N2 fixation rates per unit of phosphorus utilization, as well as C:P and N:P ratios, more than doubled at high CO2, with no change in C:N ratios. This could enhance the productivity of N-limited oligotrophic oceans, drive some of these areas into P limitation, and increase biological carbon sequestration in the ocean. The observed CO2 sensitivity of Trichodesmium could thereby provide a strong negative feedback to the atmospheric CO2 increase.

Relevance:

10.00%

Publisher:

Abstract:

The response of the coccolithophore Emiliania huxleyi to rising CO2 concentrations is well documented for acclimated cultures, where cells are exposed to the CO2 treatments for several generations prior to the experiment. The exact number of generations required for acclimation to CO2-induced changes in seawater carbonate chemistry, however, is unknown. Here we show that Emiliania huxleyi's short-term response (26 h), after cultures grown at 500 µatm were abruptly exposed to changed CO2 concentrations (~190, 410, 800 and 1500 µatm), is similar to that obtained with acclimated cultures under comparable conditions in earlier studies. Most importantly, from the lower CO2 levels (190 and 410 µatm) to 750 and 1500 µatm, calcification decreased and organic carbon fixation increased within the first 8 to 14 h after exposing the cultures to the changed carbonate chemistry. This suggests that Emiliania huxleyi rapidly alters the rates of essential metabolic processes in response to changes in seawater carbonate chemistry, establishing a new physiological "state" (acclimation) within a matter of hours. If this relatively rapid response applies to other phytoplankton species, it may simplify the interpretation of studies with natural communities (e.g. mesocosm studies and ship-board incubations), where it is often not feasible to allow for a pre-conditioning phase before starting experimental incubations.

Relevance:

10.00%

Publisher:

Abstract:

This work evaluated the technical characteristics of colored cotton fibers grown in settlements in Guamaré, from colored cotton seeds donated from the Germplasm Bank of Embrapa Cotton. Through a breeding program, we sought to raise the resistance, fineness, length, and uniformity of the cotton fibers, to stabilize the staining of the fibers in the BRS Topaz, BRS Brown, and BRS Green shades, and to raise their productivity in the field. First, individual selections with progeny tests of the seeds were performed; thereafter, the hybridization method followed by family selection was used to obtain variations in the color tones. The BRS Topaz, BRS Brown, and BRS Green varieties were produced, analyzed, and compared with the white cotton already grown in the region. The amount of impurities and neps, length, length uniformity, short fiber content, fineness, and tensile strength of the fibers were measured with Classifiber, NATI, Pressley, and Micronaire devices. Ten trials, each with ten tests, were carried out for all four fiber types. The White and Topaz fibers showed greater length (32-34 mm), greater resistance (7.94 lb/mg and 7.97 lb/mg, respectively), greater fineness with lower micronaire indices of 3.71 µg/inch and 3.73 µg/inch, and a low rate of short fibers. The results are very promising for the use of genetically improved cotton in the manufacture of fabric and yarn in the textile industry. The brown colored cotton fibers were also used in the manufacture of a composite with thermoplastic resin.

Relevance:

10.00%

Publisher:

Abstract:

Automatic detection of blood components is an important topic in the field of hematology. Segmentation is an important stage because it allows components to be grouped into common areas and processed separately, and differential classification of leukocytes enables them to be analyzed separately. With auto-segmentation and differential classification, this work contributes to the analysis of blood components by providing tools that reduce manual labor while increasing accuracy and efficiency. Using digital image processing techniques associated with a generic and automatic fuzzy approach, this work proposes two Fuzzy Inference Systems, referred to as I and II, for the auto-segmentation of blood components and the differential classification of leukocytes, respectively, in microscopic blood smear images. Using Fuzzy Inference System I, the proposed technique segments the image into four regions: the leukocyte's nucleus and cytoplasm, the erythrocyte area, and the plasma area; using Fuzzy Inference System II and the segmented leukocyte (nucleus and cytoplasm), it classifies leukocytes differentially into five types: basophils, eosinophils, lymphocytes, monocytes, and neutrophils. For testing, 530 images containing microscopic samples of blood smears prepared with different staining methods were used. The images were processed, and their accuracy indices and Gold Standards were calculated and compared with manual results and with other results reported in the literature for the same problems. Regarding segmentation, the developed technique achieved accuracies of 97.31% for leukocytes, 95.39% for erythrocytes, and 95.06% for blood plasma. As for the differential classification, the accuracy varied between 92.98% and 98.39% for the different leukocyte types. In addition to providing auto-segmentation and differential classification, the proposed technique also contributes to the definition of new descriptors and the construction of an image database covering various hematological staining processes.
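
For readers unfamiliar with fuzzy inference, the sketch below shows a Mamdani-style rule system in the spirit of Fuzzy Inference System I, using scikit-fuzzy. It is not the authors' system: the features, membership ranges, and rules are invented for illustration.

```python
# Hedged sketch of a Mamdani-style fuzzy inference step (not the authors'
# system). Features, membership ranges, and rules are illustrative only.
import numpy as np
import skfuzzy as fuzz
from skfuzzy import control as ctrl

intensity = ctrl.Antecedent(np.arange(0, 256), "intensity")
saturation = ctrl.Antecedent(np.arange(0, 256), "saturation")
region = ctrl.Consequent(np.arange(0, 4, 0.01), "region")  # 0..3 ~ four regions

intensity["dark"] = fuzz.trimf(intensity.universe, [0, 0, 110])
intensity["medium"] = fuzz.trimf(intensity.universe, [70, 128, 190])
intensity["bright"] = fuzz.trimf(intensity.universe, [150, 255, 255])
saturation["low"] = fuzz.trimf(saturation.universe, [0, 0, 110])
saturation["high"] = fuzz.trimf(saturation.universe, [90, 255, 255])
region["nucleus"] = fuzz.trimf(region.universe, [0, 0, 1])
region["cytoplasm"] = fuzz.trimf(region.universe, [0.5, 1.5, 2.5])
region["erythrocyte"] = fuzz.trimf(region.universe, [1.5, 2.5, 3])
region["plasma"] = fuzz.trimf(region.universe, [2.5, 3, 3])

rules = [
    ctrl.Rule(intensity["dark"] & saturation["high"], region["nucleus"]),
    ctrl.Rule(intensity["medium"] & saturation["high"], region["cytoplasm"]),
    ctrl.Rule(intensity["medium"] & saturation["low"], region["erythrocyte"]),
    ctrl.Rule(intensity["bright"] & saturation["low"], region["plasma"]),
]
sim = ctrl.ControlSystemSimulation(ctrl.ControlSystem(rules))
sim.input["intensity"] = 70
sim.input["saturation"] = 200
sim.compute()
print(sim.output["region"])  # defuzzified score; round to pick one of 4 regions
```

Running such an inference per pixel (or per superpixel) yields the region map that the described technique then refines into the nucleus, cytoplasm, erythrocyte, and plasma areas.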

Relevance:

10.00%

Publisher:

Abstract:

A Wireless Sensor Network (WSN) consists of devices distributed over an area in order to monitor physical variables such as temperature, pressure, vibration, motion, and environmental conditions in places where wired networks would be difficult or impractical to implement, for example, in industrial applications of difficult access, monitoring and control of on-shore or off-shore oil wells, and monitoring of large agricultural and animal farming areas, among others. To be viable, a WSN must meet important requirements such as low cost, low latency, and especially low power consumption. To ensure these requirements, however, these networks operate with limited resources and are sometimes used in hostile environments, leading to high failure rates, such as segmented routes and message loss, which reduce efficiency and can compromise the entire network. This work presents FTE-LEACH, a fault-tolerant and energy-efficient routing protocol that maintains efficiency in communication and data dissemination. The protocol was developed based on the IEEE 802.15.4 standard and is suitable for industrial networks with limited energy resources.
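
For background, the sketch below shows the classic LEACH cluster-head election rule that LEACH-family protocols build on; the abstract does not detail FTE-LEACH's fault-tolerance and energy extensions, so they are not reproduced here, and all names are illustrative.

```python
# Background sketch: the classic LEACH cluster-head election threshold.
# FTE-LEACH's fault-tolerance extensions are not described in the abstract
# and are therefore not modeled here.
import random

def leach_threshold(p, r):
    """T(n) = p / (1 - p * (r mod 1/p)): election probability in round r
    for nodes that have not yet served as cluster head in the current epoch."""
    period = round(1 / p)  # rounds per epoch
    return p / (1 - p * (r % period))

def elect_cluster_heads(node_ids, p, r):
    t = leach_threshold(p, r)
    return [n for n in node_ids if random.random() < t]

# With p = 0.05, roughly 5% of nodes become cluster heads each round.
print(elect_cluster_heads(range(100), p=0.05, r=3))
```

Rotating the cluster-head role this way spreads the energy cost of aggregation and long-range transmission across the network, which is the energy-efficiency lever a LEACH-based protocol inherits.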

Relevance:

10.00%

Publisher:

Abstract:

This work presents the numerical analysis of nonlinear trusses subjected to thermomechanical actions with the Finite Element Method (FEM). The proposed formulation is the so-called positional FEM, which is based on the minimum potential energy theorem written in terms of nodal positions instead of displacements. The study considers the effects of both geometric and material nonlinearities. For dynamic problems, a comparison between different time integration algorithms is performed. The formulation is extended to impact problems between trusses and a rigid wall, where the nodal positions are constrained by a null-penetration condition. In addition, a thermodynamically consistent formulation is presented, based on the first and second laws of thermodynamics and on the Helmholtz free energy, for analyzing dynamic problems of truss structures with thermoelastic and thermoplastic behavior. The numerical results of the proposed formulation are compared with examples found in the literature.
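
The positional idea can be illustrated on a single truss bar. The sketch below, a minimal stand-in rather than the paper's formulation, takes current nodal positions as the unknowns and derives the internal force from the Green-Lagrange strain of the bar; the material model (St. Venant-Kirchhoff) and all values are assumptions.

```python
# Hedged sketch of the positional idea for one 2D truss bar: the unknowns are
# current nodal positions x, not displacements. Material is St. Venant-
# Kirchhoff (linear elastic in Green strain); names and values are illustrative.
import numpy as np

def bar_internal_force(x0, x1, X0, X1, E, A):
    """Internal force (4,) from reference positions X and current positions x."""
    L0 = np.linalg.norm(X1 - X0)              # reference length
    l = np.linalg.norm(x1 - x0)               # current length
    green = (l**2 - L0**2) / (2 * L0**2)      # Green-Lagrange strain
    S = E * green                             # second Piola-Kirchhoff stress
    f1 = S * A * (x1 - x0) / L0               # dU/dx1, geometrically exact
    return np.concatenate([-f1, f1])          # equal and opposite at the nodes

X0, X1 = np.array([0.0, 0.0]), np.array([1.0, 0.0])
x1 = np.array([1.1, 0.2])                     # stretched and rotated configuration
print(bar_internal_force(X0, x1, X0, X1, E=210e9, A=1e-4))
```

Because the strain is written directly in positions, large rotations are captured exactly; assembling these forces and solving the nonlinear equilibrium by Newton iteration is the geometric-nonlinearity part of the analysis.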

Relevance:

10.00%

Publisher:

Abstract:

In this thesis, a numerical design approach has been proposed and developed based on the transmission matrix method to characterize periodic and quasi-periodic photonic structures in silicon-on-insulator. The approach and its performance have been extensively tested on specific structures in 2D, and its validity has been verified in 3D.
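
The transmission (transfer) matrix mechanics are easiest to see in 1D. The sketch below computes the transmittance of a periodic quarter-wave stack at normal incidence; the thesis targets 2D/3D silicon-on-insulator structures, so this is only an illustration of the matrix method, with Si- and SiO2-like indices as assumed values.

```python
# Hedged sketch: 1D transfer/transmission-matrix calculation for a periodic
# layer stack at normal incidence (illustrative of the method, not the thesis).
import numpy as np

def layer_matrix(n, d, wavelength):
    """Characteristic matrix of one homogeneous layer (index n, thickness d)."""
    k = 2 * np.pi * n / wavelength
    return np.array([[np.cos(k * d), 1j * np.sin(k * d) / n],
                     [1j * n * np.sin(k * d), np.cos(k * d)]])

def transmittance(layers, wavelength, n_in=1.0, n_out=1.0):
    """Power transmittance through a stack of (index, thickness) layers."""
    M = np.eye(2, dtype=complex)
    for n, d in layers:
        M = M @ layer_matrix(n, d, wavelength)
    (m11, m12), (m21, m22) = M
    t = 2 * n_in / (n_in * m11 + n_in * n_out * m12 + m21 + n_out * m22)
    return (n_out / n_in) * abs(t) ** 2

# Quarter-wave pair with Si- and SiO2-like indices, repeated ten times:
lam0 = 1.55  # micrometres
stack = [(3.48, lam0 / (4 * 3.48)), (1.44, lam0 / (4 * 1.44))] * 10
print(transmittance(stack, lam0))  # deep in the photonic band gap -> near zero
```

Quasi-periodic structures are handled the same way: only the sequence of layer matrices changes, which is what makes the method attractive for design sweeps.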

Relevance:

10.00%

Publisher:

Abstract:

Lung cancer is the most common malignant tumor, with 1.59 million new cases worldwide in 2012. Early detection is the main factor determining the survival of patients affected by this disease, and correct classification is important for defining the most appropriate therapeutic approach as well as for suggesting the prognosis and the clinical evolution of the disease. Among the exams used to detect lung cancer, computed tomography (CT) has been the most indicated. However, CT images are naturally complex, and even medical experts are subject to errors in detection or classification. To assist the detection of malignant tumors, computer-aided diagnosis systems have been developed to help reduce the number of biopsies caused by false positives. In this work, an automatic classification system for pulmonary nodules in CT images was developed using Artificial Neural Networks. Morphological, texture, and intensity attributes were extracted from tomographic image cuts of lung nodules using elliptical regions of interest, which were subsequently segmented by the Otsu method. These features were selected through statistical tests that compare populations (Student's t-test and the Mann-Whitney U test), from which a ranking was derived. The selected features were fed into backpropagation Artificial Neural Networks to compose two classifiers: one to classify nodules as malignant or benign (network 1), and another to classify two types of malignancy (network 2), forming a cascade classifier. The best networks were combined, and their performance was measured by the area under the ROC curve, with network 1 and network 2 achieving performances of 0.901 and 0.892, respectively.
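
The feature-selection step is straightforward to sketch. Below, candidate features are ranked by comparing benign and malignant populations with Student's t and Mann-Whitney U tests, as the abstract describes; the data are synthetic stand-ins for the extracted nodule attributes.

```python
# Sketch of the feature-selection step: rank candidate features by comparing
# benign vs. malignant populations. Data are synthetic stand-ins.
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
benign = rng.normal(0.0, 1.0, (60, 5))      # 60 nodules x 5 candidate features
malignant = rng.normal(0.4, 1.0, (40, 5))   # shifted mean on every feature

for j in range(benign.shape[1]):
    t_p = stats.ttest_ind(benign[:, j], malignant[:, j], equal_var=False).pvalue
    u_p = stats.mannwhitneyu(benign[:, j], malignant[:, j]).pvalue
    print(f"feature {j}: t-test p = {t_p:.3f}, Mann-Whitney p = {u_p:.3f}")
# Features would then be ranked by p-value before feeding the selected set
# into the backpropagation networks.
```

Using both a parametric and a non-parametric test guards against features whose distributions violate the normality assumption of the t-test.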

Relevance:

10.00%

Publisher:

Abstract:

Purpose: To determine whether the 'through-focus' aberrations of patients implanted with multifocal or accommodative intraocular lenses (IOLs) can be used to provide rapid and reliable measures of their subjective range of clear vision. Methods: Eyes that had been implanted for over a year with concentric (n = 8), segmented (n = 10) or accommodating (n = 6) intraocular lenses (mean age 62.9 ± 8.9 years; range 46-79 years) underwent simultaneous monocular subjective (electronic logMAR test chart at 4 m with letters randomised between presentations) and objective (Aston open-field aberrometer) defocus curve testing for levels of defocus between +1.50 and -5.00 DS in -0.50 DS steps, in a randomised order. Pupil size and ocular aberration (a combination of the patient's aberrations and those of the defocus-inducing lens) at each level of blur were measured by the aberrometer. Visual acuity was measured subjectively at each level of defocus to determine the traditional defocus curve, and objective acuity was predicted using image quality metrics. Results: The range of clear focus differed between the three IOL types (F = 15.506, P = 0.001) as well as between subjective and objective defocus curves (F = 6.685, P = 0.049). There was no statistically significant difference between subjective and objective defocus curves in the segmented or concentric-ring MIOL groups (P > 0.05); however, a difference was found between the two measures in the accommodating IOL group (P < 0.001). The mean Delta logMAR (predicted minus measured logMAR) across all target vergences was -0.06 ± 0.19 logMAR. Predicted logMAR defocus curves for the multifocal IOLs did not show a near-vision addition peak, unlike the subjective measurements of visual acuity; however, there was a strong positive correlation between measured and predicted logMAR for all three IOLs (Pearson's correlation: P < 0.001). Conclusions: Current subjective procedures are lengthy and do not allow important additional measures, such as defocus curves under different luminance or contrast levels, to be assessed, which may limit our understanding of MIOL performance in real-world conditions. In general, objective aberrometry measures correlated well with the subjective assessment, indicating the relative robustness of this technique in evaluating post-operative success with segmented and concentric-ring MIOLs.
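
To illustrate how a "range of clear vision" can be read off a defocus curve, the sketch below interpolates a curve and measures the defocus span where acuity beats a criterion. The 0.3 logMAR criterion and the acuity values are assumptions for illustration, not the study's data.

```python
# Illustrative sketch: estimate a "range of clear vision" from a defocus curve
# as the span of defocus where acuity beats a 0.3 logMAR criterion. The
# criterion and the acuity values below are assumptions, not study data.
import numpy as np

defocus = np.arange(1.5, -5.01, -0.5)   # +1.50 to -5.00 DS in -0.50 DS steps
acuity = np.array([0.42, 0.30, 0.18, 0.08, 0.02, 0.05, 0.15,
                   0.28, 0.33, 0.30, 0.24, 0.31, 0.45, 0.58])  # logMAR

fine = np.linspace(defocus.min(), defocus.max(), 1000)
interp = np.interp(fine, defocus[::-1], acuity[::-1])  # np.interp wants ascending x
clear = fine[interp < 0.3]
# Span between the first and last defocus values meeting the criterion:
print(f"range of clear vision: {clear.max() - clear.min():.2f} D")
```

The same calculation applies to either a subjective curve or an aberrometry-predicted curve, which is what allows the two measures to be compared quantitatively.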