9 results for Development of new products

in Digital Commons at Florida International University


Relevance:

100.00%

Publisher:

Abstract:

The primary purpose of this study was to investigate agreement among five equations by which clinicians estimate water requirements (EWR) and to determine how well these equations predict total water intake (TWI). The Institute of Medicine has used TWI as a measure of water requirements. A secondary goal of this study was to develop practical equations to predict TWI; such equations could then be considered accurate predictors of an individual's water requirement. Regressions were performed to determine agreement between the five equations, and between the five equations and TWI, using NHANES 1999–2004. The criteria for agreement were (1) strong correlation coefficients for all comparisons and (2) a regression line not significantly different from the line of equality (x = y), i.e., the 95% CIs of the slope and intercept had to include one and zero, respectively. Correlations were performed to determine the association between fat-free mass (FFM) and TWI. Clinically significant variables were selected to build equations for predicting TWI. All analyses were performed with SAS software and were weighted to account for the complex survey design and for oversampling. Results showed that the five EWR equations were strongly correlated but did not agree with each other. Further, the EWR equations were all weakly associated with TWI and lacked agreement with TWI. The strongest agreement, between the NRC equation and TWI, explained only 8.1% of the variability of TWI. Fat-free mass was positively correlated with TWI. Two models were created to predict TWI. Both models included the variables race/ethnicity, kcals, age, and height; one model also included FFM and gender, while the other included BMI and osmolality. Neither model accounted for more than 28% of the variability of TWI. These results provide evidence that estimates of water requirements vary depending upon which EWR equation the clinician selects. None of the existing EWR equations predicted TWI, nor could a prediction equation be created that explained a satisfactory amount of the variance in TWI. A good estimate of water requirements may not be predicted by TWI. Future research should focus on using more valid measures to predict water requirements.
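
The agreement criterion described above can be made concrete with a short sketch: regress one EWR estimate on another (or on TWI) and check whether the 95% confidence intervals of the slope and intercept include one and zero. The function names and toy data below are illustrative; the study itself used SAS with survey weights, which this sketch omits.

```python
# Illustrative check of the agreement criterion (not the study's SAS code;
# survey weighting is omitted for brevity).
import numpy as np
from scipy import stats

def agrees_with_line_of_equality(x, y, alpha=0.05):
    """True if the y-on-x regression line is statistically
    indistinguishable from the line of equality y = x."""
    res = stats.linregress(x, y)
    t_crit = stats.t.ppf(1 - alpha / 2, df=len(x) - 2)
    slope_ok = abs(res.slope - 1) <= t_crit * res.stderr
    intercept_ok = abs(res.intercept) <= t_crit * res.intercept_stderr
    return slope_ok and intercept_ok

# Hypothetical paired estimates from two EWR equations (liters/day):
# strongly correlated, yet failing the agreement test.
rng = np.random.default_rng(0)
ewr_a = rng.normal(3.0, 0.5, 200)
ewr_b = 0.8 * ewr_a + 0.9 + rng.normal(0, 0.2, 200)
print(agrees_with_line_of_equality(ewr_a, ewr_b))
```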

Relevance:

100.00%

Publisher:

Abstract:

Tumor functional volume (FV) and its mean activity concentration (mAC) are quantities derived from positron emission tomography (PET). They are used for estimating radiation dose for therapy, evaluating the progression of a disease, and as a prognostic indicator for predicting outcome. PET images have low resolution and high noise, and they are affected by the partial volume effect (PVE). Manually segmenting each tumor is cumbersome and hard to reproduce. To address this problem, I developed the iterative deconvolution thresholding segmentation (IDTS) algorithm, which segments the tumor, measures the FV, corrects for the PVE, and calculates the mAC. The algorithm corrects for the PVE without the need to estimate the camera's point spread function (PSF) and does not require optimization for a specific camera. The algorithm was tested in physical phantom studies, where hollow spheres (0.5-16 ml) were used to represent tumors with a homogeneous activity distribution, and on irregularly shaped tumors with a heterogeneous activity profile, acquired using physical and simulated phantoms. The physical phantom studies were performed with different signal-to-background ratios (SBR) and different acquisition times (1-5 min). The algorithm was applied to ten clinical datasets, and the results were compared with manual segmentation and with the fixed-percentage thresholding methods T50 and T60, in which 50% and 60% of the maximum intensity, respectively, is used as the threshold. The average errors in the FV and mAC calculations were 30% and -35% for the 0.5 ml tumor, and ~5% for the 16 ml tumor. The overall FV error was ~10% for heterogeneous tumors in the physical and simulated phantom data. Compared to manual segmentation, the FV and mAC errors for the clinical images were around -17% and 15%, respectively. In summary, the algorithm has the potential to be applied to data acquired from different cameras, as it does not depend on knowing the camera's PSF, and it can also improve dose estimation and treatment planning.
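
The fixed-percentage thresholding baselines (T50/T60) used for comparison above are simple to state in code. The sketch below assumes a NumPy image array and a known voxel volume; it shows the baseline method, not the IDTS algorithm itself, and the names are illustrative.

```python
# T50/T60 baseline as described above: threshold at a fixed fraction of
# the maximum intensity, then read FV and mAC off the mask. This is the
# comparison method, not IDTS.
import numpy as np

def fixed_threshold_segment(image, voxel_volume_ml, fraction=0.5):
    """fraction=0.5 gives T50, fraction=0.6 gives T60."""
    mask = image >= fraction * image.max()
    fv = mask.sum() * voxel_volume_ml   # functional volume (ml)
    mac = image[mask].mean()            # mean activity concentration
    return mask, fv, mac
```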

Relevance:

100.00%

Publisher:

Abstract:

During the past two decades, many researchers have developed methods for detecting structural defects at an early stage, both to operate aerospace vehicles safely and to reduce operating costs. The Surface Response to Excitation (SuRE) method is one of these approaches, developed at FIU to reduce the cost and size of the equipment. The SuRE method excites the surface at a series of frequencies and monitors the propagation characteristics of the generated waves. The amplitude of the waves reaching any point on the surface varies with frequency; however, it remains consistent as long as the integrity of and strain distribution on the part are consistent. These spectral characteristics change when cracks develop or the strain distribution changes. Structural health monitoring (SHM) methods may be used for many applications, from the detection of loose screws to the monitoring of manufacturing operations. A scanning laser vibrometer was used in this study to investigate the characteristics of the spectral changes at different points on the parts. The study started with detecting a load on a plate and estimating its location. Modifications made to a part by manufacturing operations were then detected, and the Part-Based Manufacturing Process Performance Monitoring (PbPPM) method was developed. Hardware was prepared to demonstrate the feasibility of the proposed methods in real time. Data were collected successfully for the SuRE and PbPPM methods using low-cost piezoelectric elements and a non-contact scanning laser vibrometer. The location of an applied force, loose bolts, and material loss could easily be detected by comparing the spectral characteristics of the arriving waves. The on-line methods used fast computational techniques to estimate the spectrum and detected changing operational conditions from the sum of the squares of the variations. Neural networks classified the spectra when the desktop-DSP combination was used. The results demonstrated the feasibility of the SuRE and PbPPM methods.
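
The detection step described above reduces to comparing a baseline frequency sweep with a later one. A minimal sketch, assuming both spectra are sampled at the same frequencies; the array names and alarm threshold are illustrative, not values from the study.

```python
# Sum-of-squares change metric over a frequency sweep, as described above.
# Baseline/current arrays and the threshold are illustrative.
import numpy as np

def spectral_change(baseline, current):
    """Sum of squared amplitude deviations between two sweeps."""
    diff = np.asarray(current) - np.asarray(baseline)
    return float(np.sum(diff ** 2))

def flag_defect(baseline, current, threshold):
    """Flag a crack, loose bolt, or strain change when the metric jumps."""
    return spectral_change(baseline, current) > threshold
```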

Relevance:

100.00%

Publisher:

Abstract:

During the past two decades, many researchers have developed methods for detecting structural defects at an early stage, both to operate aerospace vehicles safely and to reduce operating costs. The Surface Response to Excitation (SuRE) method is one of these approaches, developed at FIU to reduce the cost and size of the equipment. The SuRE method excites the surface at a series of frequencies and monitors the propagation characteristics of the generated waves. The amplitude of the waves reaching any point on the surface varies with frequency; however, it remains consistent as long as the integrity of and strain distribution on the part are consistent. These spectral characteristics change when cracks develop or the strain distribution changes. Structural health monitoring (SHM) methods may be used for many applications, from the detection of loose screws to the monitoring of manufacturing operations. A scanning laser vibrometer was used in this study to investigate the characteristics of the spectral changes at different points on the parts. The study started with detecting a load on a plate and estimating its location. Modifications made to a part by manufacturing operations were then detected, and the Part-Based Manufacturing Process Performance Monitoring (PbPPM) method was developed. Hardware was prepared to demonstrate the feasibility of the proposed methods in real time. Data were collected successfully for the SuRE and PbPPM methods using low-cost piezoelectric elements and a non-contact scanning laser vibrometer. The location of an applied force, loose bolts, and material loss could easily be detected by comparing the spectral characteristics of the arriving waves. The on-line methods used fast computational techniques to estimate the spectrum and detected changing operational conditions from the sum of the squares of the variations. Neural networks classified the spectra when the desktop-DSP combination was used. The results demonstrated the feasibility of the SuRE and PbPPM methods.
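
The "fast computational methods for estimating the spectrum" mentioned above would typically rest on an FFT. A generic sketch follows; the sampling rate and window choice are assumptions, not details from the study.

```python
# Generic one-sided magnitude spectrum via FFT, the kind of fast spectral
# estimate on-line monitoring relies on. Parameters are illustrative.
import numpy as np

def magnitude_spectrum(signal, fs):
    """Return (frequencies, magnitudes) for a real-valued signal at fs Hz."""
    windowed = np.asarray(signal) * np.hanning(len(signal))  # reduce leakage
    spectrum = np.abs(np.fft.rfft(windowed))
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    return freqs, spectrum
```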

Relevance:

100.00%

Publisher:

Abstract:

Tumor functional volume (FV) and its mean activity concentration (mAC) are quantities derived from positron emission tomography (PET). They are used for estimating radiation dose for therapy, evaluating the progression of a disease, and as a prognostic indicator for predicting outcome. PET images have low resolution and high noise, and they are affected by the partial volume effect (PVE). Manually segmenting each tumor is cumbersome and hard to reproduce. To address this problem, I developed the iterative deconvolution thresholding segmentation (IDTS) algorithm, which segments the tumor, measures the FV, corrects for the PVE, and calculates the mAC. The algorithm corrects for the PVE without the need to estimate the camera's point spread function (PSF) and does not require optimization for a specific camera. The algorithm was tested in physical phantom studies, where hollow spheres (0.5-16 ml) were used to represent tumors with a homogeneous activity distribution, and on irregularly shaped tumors with a heterogeneous activity profile, acquired using physical and simulated phantoms. The physical phantom studies were performed with different signal-to-background ratios (SBR) and different acquisition times (1-5 min). The algorithm was applied to ten clinical datasets, and the results were compared with manual segmentation and with the fixed-percentage thresholding methods T50 and T60, in which 50% and 60% of the maximum intensity, respectively, is used as the threshold. The average errors in the FV and mAC calculations were 30% and -35% for the 0.5 ml tumor, and ~5% for the 16 ml tumor. The overall FV error was ~10% for heterogeneous tumors in the physical and simulated phantom data. Compared to manual segmentation, the FV and mAC errors for the clinical images were around -17% and 15%, respectively. In summary, the algorithm has the potential to be applied to data acquired from different cameras, as it does not depend on knowing the camera's PSF, and it can also improve dose estimation and treatment planning.
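
The iterative deconvolution half of the approach can be illustrated with a generic Van Cittert-style loop. Note the hedge: IDTS itself avoids requiring the true PSF, whereas this sketch assumes a Gaussian blur as a stand-in, and the sigma and iteration count are made-up placeholders.

```python
# Generic Van Cittert-style iterative deconvolution, shown only to
# illustrate the idea of sharpening before thresholding. IDTS does not
# require knowing the PSF; the Gaussian here is an assumed stand-in.
import numpy as np
from scipy.ndimage import gaussian_filter

def van_cittert(observed, sigma=2.0, iterations=10, relax=1.0):
    estimate = observed.astype(float)
    for _ in range(iterations):
        reblurred = gaussian_filter(estimate, sigma)
        estimate += relax * (observed - reblurred)  # correct toward data
        np.clip(estimate, 0, None, out=estimate)    # activity is non-negative
    return estimate
```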

Relevance:

100.00%

Publisher:

Abstract:

This dissertation studies context-aware applications and the algorithms proposed for the client side. The required context-aware infrastructure is discussed in depth to illustrate that such an infrastructure collects the mobile user's context information, registers service providers, derives the mobile user's current context, distributes the user context among context-aware applications, and provides tailored services. The proposed approach tries to strike a balance between the context server and the mobile devices: context acquisition is centralized at the server to ensure the reusability of context information among mobile devices, while context reasoning remains at the application level. Hence, centralized context acquisition with distributed context reasoning is viewed as the better overall solution. The context-aware search application is designed and implemented on the server side, and a new algorithm is proposed that takes the user's context profile into consideration. By promoting feedback on the dynamics of the system, any prior user selection is saved for further analysis so that it may help improve the results of a subsequent search. On the basis of these developments on the server side, various solutions are provided on the client side. A proxy software component is set up for the purpose of data collection. This research endorses the belief that the client-side proxy should contain the context reasoning component; the implementation of such a component lends credence to this belief in that the context applications are able to derive the user's context profiles. Furthermore, a context cache scheme is implemented to manage the cache on the client device in order to minimize processing requirements and other resources (bandwidth, CPU cycles, power). Java and MySQL are used to implement the proposed architecture and to test scenarios derived from the user's daily activities. To meet the practical demands of a testing environment without the heavy cost of establishing such a comprehensive infrastructure, a software simulation using a free Yahoo search API is provided as a means of evaluating the effectiveness of the design approach realistically. The integration of the Yahoo search engine into the context-aware architecture demonstrates how a context-aware application can meet user demands for tailored services and products in and around the user's environment. The test results show that the overall design is highly effective, providing new features and enriching the mobile user's experience through a broad scope of potential applications.
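
A context cache of the kind described above can be sketched as a small LRU store on the client. The dissertation's implementation used Java; the sketch below is an illustration in Python with invented class and method names.

```python
# Minimal LRU-style context cache, illustrating the client-side caching
# scheme described above (the dissertation used Java; names are invented).
from collections import OrderedDict

class ContextCache:
    def __init__(self, capacity=64):
        self.capacity = capacity
        self._items = OrderedDict()

    def get(self, key):
        if key not in self._items:
            return None               # caller re-derives or fetches context
        self._items.move_to_end(key)  # mark as recently used
        return self._items[key]

    def put(self, key, value):
        self._items[key] = value
        self._items.move_to_end(key)
        if len(self._items) > self.capacity:
            self._items.popitem(last=False)  # evict least recently used
```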

Relevance:

100.00%

Publisher:

Abstract:

This dissertation studies context-aware applications and the algorithms proposed for the client side. The required context-aware infrastructure is discussed in depth to illustrate that such an infrastructure collects the mobile user's context information, registers service providers, derives the mobile user's current context, distributes the user context among context-aware applications, and provides tailored services. The proposed approach tries to strike a balance between the context server and the mobile devices: context acquisition is centralized at the server to ensure the reusability of context information among mobile devices, while context reasoning remains at the application level. Hence, centralized context acquisition with distributed context reasoning is viewed as the better overall solution. The context-aware search application is designed and implemented on the server side, and a new algorithm is proposed that takes the user's context profile into consideration. By promoting feedback on the dynamics of the system, any prior user selection is saved for further analysis so that it may help improve the results of a subsequent search. On the basis of these developments on the server side, various solutions are provided on the client side. A proxy software component is set up for the purpose of data collection. This research endorses the belief that the client-side proxy should contain the context reasoning component; the implementation of such a component lends credence to this belief in that the context applications are able to derive the user's context profiles. Furthermore, a context cache scheme is implemented to manage the cache on the client device in order to minimize processing requirements and other resources (bandwidth, CPU cycles, power). Java and MySQL are used to implement the proposed architecture and to test scenarios derived from the user's daily activities. To meet the practical demands of a testing environment without the heavy cost of establishing such a comprehensive infrastructure, a software simulation using a free Yahoo search API is provided as a means of evaluating the effectiveness of the design approach realistically. The integration of the Yahoo search engine into the context-aware architecture demonstrates how a context-aware application can meet user demands for tailored services and products in and around the user's environment. The test results show that the overall design is highly effective, providing new features and enriching the mobile user's experience through a broad scope of potential applications.
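
The feedback idea in the search algorithm above, where saved prior selections influence subsequent searches, can be sketched as a simple re-ranking pass. Everything here, from the scoring rule to the names, is an illustrative assumption rather than the dissertation's algorithm.

```python
# Illustrative re-ranking with click feedback: results overlapping terms
# from previously selected items rank higher. Names and the scoring rule
# are assumptions, not the dissertation's method.
def rerank(results, clicked_terms):
    """Order results by overlap with terms from prior user selections."""
    def feedback_score(result):
        return len(set(result.lower().split()) & clicked_terms)
    return sorted(results, key=feedback_score, reverse=True)

# Example: terms harvested from the user's earlier selections
history = {"miami", "restaurant"}
print(rerank(["Generic news page", "Miami restaurant guide"], history))
```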

Relevance:

100.00%

Publisher:

Abstract:

The purpose of this study was to (a) develop an evaluation instrument capable of rating students' perceptions of the instructional quality of an online course and the instructor's performance, and (b) validate the proposed instrument with a study conducted at a major public university. The instrument was based upon the Seven Principles of Good Practice for Undergraduate Education (Chickering & Gamson, 1987). The study examined four specific questions: (1) Is the underlying factor structure of the new instrument consistent with Chickering and Gamson's Seven Principles? (2) Is the factor structure of the new instrument invariant for male and female students? (3) Are the scores on the new instrument related to students' expected grades? (4) Are the scores on the new instrument related to the students' perceived course workload? The instrument was designed to measure students' levels of satisfaction with their instruction, and it also gathered information concerning the students' sex, expected grade in the course, and perceptions of the amount of work required by the course. A cluster sample consisting of an array of online courses across the disciplines yielded a total of 297 students who responded to the online survey; the students in each selected course were asked to rate their instructors with the newly developed instrument. Question 1 was answered using exploratory factor analysis, which yielded a factor structure similar to the Seven Principles. Question 2 was answered by separately factor-analyzing the responses of male and female students and comparing the factor structures. The resulting factor structures for men and women were different; however, 14 items could be realigned under five factors that paralleled some of the Seven Principles. When the scores of only those 14 items were entered into two principal components factor analyses, for men and women respectively, with the factor structure restricted to five factors, the factor structures were the same for men and women. A weak positive relationship was found between students' expected grades and their scores on the instrument (Question 3). There was no relationship between students' perceived workload for the course and their scores on the instrument (Question 4).
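
A minimal sketch of the exploratory factor analysis step, assuming a Likert response matrix. The seven-factor extraction mirrors the Seven Principles framing, but the library call and data are illustrative, not the study's actual analysis.

```python
# Illustrative exploratory factor analysis on a hypothetical Likert
# response matrix; not the study's actual data or software.
import numpy as np
from sklearn.decomposition import FactorAnalysis

rng = np.random.default_rng(1)
responses = rng.integers(1, 6, size=(297, 25)).astype(float)  # hypothetical

fa = FactorAnalysis(n_components=7, rotation="varimax")  # one per Principle
scores = fa.fit_transform(responses)   # (n_students, 7) factor scores
loadings = fa.components_.T            # (n_items, 7) item-factor loadings
```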

Relevance:

100.00%

Publisher:

Abstract:

Implicit in the current design practice for minimum uplift capacity is the assumption that a connection's capacity is proportional to the number of fasteners per connection joint. This assumption may overestimate the capacity of joints by a factor of two or more and may be the cause of connection failures in extreme wind events. The current research serves to modify this practice by proposing a realistic relationship between the number of fasteners and the capacity of the joint. The research is also aimed at further developing a non-intrusive continuous load path (CLP) connection system using Glass Fiber Reinforced Polymer (GFRP) and epoxy. Suitable designs were developed for stud-to-top-plate and gable end connections, and tests were performed to evaluate their ultimate load, creep, and fatigue behavior. The objective was to determine the performance of the connections under simulated sustained hurricane conditions; the performance of the new connections was satisfactory.
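
To make the overestimation concrete: under the proportionality assumption, a joint with n fasteners is credited n times the single-fastener capacity, while a group-effect relationship credits less. The sketch below is purely illustrative; the exponent is a placeholder, not the relationship the study proposes.

```python
# Contrast between the proportionality assumption and an illustrative
# group-effect model. The 0.7 exponent is a made-up placeholder, not
# the relationship derived in the study.
def capacity_linear(n_fasteners, per_fastener_kn):
    """Current practice: capacity scales linearly with fastener count."""
    return n_fasteners * per_fastener_kn

def capacity_group_effect(n_fasteners, per_fastener_kn, exponent=0.7):
    """Hypothetical diminishing-returns model for multi-fastener joints."""
    return per_fastener_kn * n_fasteners ** exponent

# With 8 fasteners at 2 kN each, the linear model credits 16 kN; the
# illustrative group model credits about 8.6 kN -- nearly a factor of two.
print(capacity_linear(8, 2.0), round(capacity_group_effect(8, 2.0), 1))
```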