941 results for Power quality indices


Relevance:

30.00%

Publisher:

Abstract:

Integration of the measurement activity into the production process is an essential rule in digital enterprise technology, especially for large-volume product manufacturing such as the aerospace, shipbuilding, power generation and automotive industries. Measurement resource planning is a structured method of selecting and deploying the measurement resources necessary to meet the quality aims of product development. In this research, a new mapping approach for measurement resource planning is proposed. Firstly, quality aims are identified in the form of a number of specifications and engineering requirements on quality characteristics (QCs) at a specific stage of the product life cycle, and measurement systems are classified according to the attributes of the QCs. Secondly, a matrix mapping approach for measurement resource planning is outlined, together with an optimization algorithm for matching quality aims with measurement systems. Finally, the proposed methodology is applied in shipbuilding to solve a measurement resource planning problem, deploying the measurement resources so that all quality aims are satisfied. © Springer-Verlag Berlin Heidelberg 2010.
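The matrix-mapping step can be read as a small assignment problem. A minimal sketch with invented suitability scores and measurement-system names (the paper's actual matrices and optimization algorithm are not reproduced here):

```python
from itertools import permutations

# Hypothetical example of the matrix-mapping idea: rows are quality aims,
# columns are available measurement systems, entries are suitability scores.
# We pick the one-to-one assignment that maximises total suitability.
suitability = [
    # laser tracker, CMM, photogrammetry  (illustrative systems)
    [9, 6, 4],   # aim 1: hull-section alignment
    [3, 9, 5],   # aim 2: small-part geometry
    [6, 4, 8],   # aim 3: large free-form surface
]

def best_assignment(matrix):
    """Exhaustively match each quality aim to a distinct measurement system."""
    n = len(matrix)
    best_score, best_perm = -1, None
    for perm in permutations(range(n)):
        score = sum(matrix[aim][sys] for aim, sys in enumerate(perm))
        if score > best_score:
            best_score, best_perm = score, perm
    return best_score, best_perm

score, assignment = best_assignment(suitability)
```

For larger matrices a polynomial-time method (e.g. the Hungarian algorithm) would replace the brute-force search.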

Relevance:

30.00%

Publisher:

Abstract:

X-ray computed tomography (CT) imaging constitutes one of the most widely used diagnostic tools in radiology today, with nearly 85 million CT examinations performed in the U.S. in 2011. CT imparts a relatively high amount of radiation dose to the patient compared to other x-ray imaging modalities, and as a result of this fact, coupled with its popularity, CT is currently the single largest source of medical radiation exposure to the U.S. population. For this reason, there is a critical need to optimize CT examinations such that the dose is minimized while the quality of the CT images is not degraded. This optimization can be difficult to achieve due to the relationship between dose and image quality. All else being equal, reducing the dose degrades image quality and can impact the diagnostic value of the CT examination.

A recent push from the medical and scientific community towards using lower doses has spawned new dose reduction technologies such as automatic exposure control (i.e., tube current modulation) and iterative reconstruction algorithms. In theory, these technologies could allow for scanning at reduced doses while maintaining the image quality of the exam at an acceptable level. Therefore, there is a scientific need to establish the dose reduction potential of these new technologies in an objective and rigorous manner. Establishing these dose reduction potentials requires precise and clinically relevant metrics of CT image quality, as well as practical and efficient methodologies to measure such metrics on real CT systems. The currently established methodologies for assessing CT image quality are not appropriate to assess modern CT scanners that have implemented those aforementioned dose reduction technologies.

Thus, the purpose of this doctoral project was to develop, assess, and implement new phantoms, image quality metrics, analysis techniques, and modeling tools that are appropriate for image quality assessment of modern clinical CT systems. The project developed image quality assessment methods in the context of three distinct paradigms: (a) uniform phantoms, (b) textured phantoms, and (c) clinical images.

The work in this dissertation used the “task-based” definition of image quality. That is, image quality was broadly defined as the effectiveness by which an image can be used for its intended task. Under this definition, any assessment of image quality requires three components: (1) a well-defined imaging task (e.g., detection of subtle lesions), (2) an “observer” to perform the task (e.g., a radiologist or a detection algorithm), and (3) a way to measure the observer’s performance in completing the task at hand (e.g., detection sensitivity/specificity).

First, this task-based image quality paradigm was implemented using a novel multi-sized phantom platform (with uniform background) developed specifically to assess modern CT systems (Mercury Phantom, v3.0, Duke University). A comprehensive evaluation was performed on a state-of-the-art CT system (SOMATOM Definition Force, Siemens Healthcare) in terms of noise, resolution, and detectability as a function of patient size, dose, tube energy (i.e., kVp), automatic exposure control, and reconstruction algorithm (i.e., Filtered Back-Projection – FBP vs Advanced Modeled Iterative Reconstruction – ADMIRE). A mathematical observer model (i.e., computer detection algorithm) was implemented and used as the basis of image quality comparisons. It was found that image quality increased with increasing dose and decreasing phantom size. The CT system exhibited nonlinear noise and resolution properties, especially at very low doses, large phantom sizes, and for low-contrast objects. Objective image quality metrics generally increased with increasing dose and ADMIRE strength, and with decreasing phantom size. The ADMIRE algorithm could offer comparable image quality at reduced doses or improved image quality at the same dose (increase in detectability index by up to 163% depending on iterative strength). The use of automatic exposure control resulted in more consistent image quality with changing phantom size.

Based on those results, the dose reduction potential of ADMIRE was further assessed specifically for the task of detecting small (<=6 mm) low-contrast (<=20 HU) lesions. A new low-contrast detectability phantom (with uniform background) was designed and fabricated using a multi-material 3D printer. The phantom was imaged at multiple dose levels and images were reconstructed with FBP and ADMIRE. Human perception experiments were performed to measure the detection accuracy from FBP and ADMIRE images. It was found that ADMIRE had equivalent performance to FBP at 56% less dose.

Using the same image data as the previous study, a number of different mathematical observer models were implemented to assess which models would result in image quality metrics that best correlated with human detection performance. The models ranged from simple metrics of image quality, such as the contrast-to-noise ratio (CNR), to more sophisticated observer models such as the non-prewhitening matched filter observer model family and the channelized Hotelling observer model family. It was found that the non-prewhitening matched filter observers and the channelized Hotelling observers both correlated strongly with human performance. Conversely, CNR was found not to correlate strongly with human performance, especially when comparing different reconstruction algorithms.
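Two of the metrics compared above can be sketched in a few lines. This is an illustrative computation assuming a synthetic Gaussian lesion in white noise; the dissertation's actual observer models (channelized Hotelling, multi-slice variants) are more elaborate:

```python
import numpy as np

# Illustrative computation of two metrics compared in the study: simple CNR and
# the detectability index d' of a non-prewhitening (NPW) matched filter for a
# known signal in white noise, d'^2 = sum(signal^2) / noise_variance. The
# Gaussian "lesion" and noise level below are invented.
x = np.arange(-16, 16)
xx, yy = np.meshgrid(x, x)
signal = 20.0 * np.exp(-(xx**2 + yy**2) / (2 * 3.0**2))  # 20 HU peak contrast

noise_sigma = 10.0  # HU, quantum-noise standard deviation (assumed)

cnr = signal.max() / noise_sigma                      # contrast-to-noise ratio
d_prime = np.sqrt(np.sum(signal**2)) / noise_sigma    # NPW observer, white noise
```

The NPW form above holds only for white (uncorrelated) noise; correlated CT noise requires the full matched-filter template in the frequency domain.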

The uniform background phantoms used in the previous studies provided a good first-order approximation of image quality. However, due to their simplicity and due to the complexity of iterative reconstruction algorithms, it is possible that such phantoms are not fully adequate to assess the clinical impact of iterative algorithms because patient images obviously do not have smooth uniform backgrounds. To test this hypothesis, two textured phantoms (classified as gross texture and fine texture) and a uniform phantom of similar size were built and imaged on a SOMATOM Flash scanner (Siemens Healthcare). Images were reconstructed using FBP and Sinogram Affirmed Iterative Reconstruction (SAFIRE). Using an image subtraction technique, quantum noise was measured in all images of each phantom. It was found that with FBP, the noise was independent of the background (textured vs uniform). However, for SAFIRE, noise increased by up to 44% in the textured phantoms compared to the uniform phantom. As a result, the noise reduction from SAFIRE was found to be up to 66% in the uniform phantom but as low as 29% in the textured phantoms. Based on this result, it was clear that further investigation was needed to understand the impact that background texture has on image quality when iterative reconstruction algorithms are used.
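The image-subtraction technique rests on the fact that subtracting two repeated scans cancels the deterministic background and leaves only noise. A minimal sketch with simulated data (phantom values and noise level are invented):

```python
import numpy as np

# Sketch of the image-subtraction noise measurement: subtracting two repeated
# scans of the same phantom cancels the fixed background/texture and leaves
# pure quantum noise, whose std is sqrt(2) times the per-image noise std.
rng = np.random.default_rng(1)

texture = rng.normal(50.0, 15.0, size=(256, 256))    # fixed phantom background
sigma = 8.0                                          # true per-image noise std
scan_a = texture + rng.normal(0.0, sigma, size=texture.shape)
scan_b = texture + rng.normal(0.0, sigma, size=texture.shape)

diff = scan_a - scan_b                    # background cancels exactly
noise_estimate = diff.std() / np.sqrt(2)  # recovers per-image noise std
```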

To further investigate this phenomenon with more realistic textures, two anthropomorphic textured phantoms were designed to mimic lung vasculature and fatty soft tissue texture. The phantoms (along with a corresponding uniform phantom) were fabricated with a multi-material 3D printer and imaged on the SOMATOM Flash scanner. Scans were repeated a total of 50 times in order to get ensemble statistics of the noise. A novel method of estimating the noise power spectrum (NPS) from irregularly shaped ROIs was developed. It was found that SAFIRE images had highly locally non-stationary noise patterns, with pixels near edges having higher noise than pixels in more uniform regions. Compared to FBP, SAFIRE images had on average 60% less noise in uniform regions; for edge pixels, noise was between 20% higher and 40% lower. The noise texture (i.e., NPS) was also highly dependent on the background texture for SAFIRE. Therefore, it was concluded that quantum noise properties in the uniform phantoms are not representative of those in patients for iterative reconstruction algorithms, and texture should be considered when assessing image quality of iterative algorithms.
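For context, the standard NPS estimate from square ROIs, which the novel irregular-ROI method generalizes, can be sketched as follows (simulated white noise, assumed pixel size):

```python
import numpy as np

# A minimal noise power spectrum (NPS) estimate from repeated noise ROIs using
# the standard 2D periodogram average. The dissertation's novel method extends
# this to irregularly shaped ROIs; this sketch assumes square ROIs and a
# made-up pixel size.
rng = np.random.default_rng(2)
pixel_size = 0.5            # mm (assumed)
n, n_rois = 64, 100

rois = rng.normal(0.0, 10.0, size=(n_rois, n, n))   # simulated white noise
rois -= rois.mean(axis=(1, 2), keepdims=True)       # detrend each ROI

# NPS(u, v) = (dx*dy / (Nx*Ny)) * <|DFT2(roi)|^2>, averaged over ROIs
dft = np.fft.fft2(rois)
nps = (pixel_size**2 / (n * n)) * np.mean(np.abs(dft) ** 2, axis=0)

# Sanity check: integrating the NPS over frequency recovers the pixel variance
# (frequency sample spacing df = 1 / (n * pixel_size)).
variance_from_nps = nps.sum() * (1.0 / (n * pixel_size)) ** 2
```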

To move beyond just assessing noise properties in textured phantoms towards assessing detectability, a series of new phantoms were designed specifically to measure low-contrast detectability in the presence of background texture. The textures used were optimized to match the texture in the liver regions of actual patient CT images using a genetic algorithm. The so-called “Clustered Lumpy Background” texture synthesis framework was used to generate the modeled texture. Three textured phantoms and a corresponding uniform phantom were fabricated with a multi-material 3D printer and imaged on the SOMATOM Flash scanner. Images were reconstructed with FBP and SAFIRE and analyzed using a multi-slice channelized Hotelling observer to measure detectability and the dose reduction potential of SAFIRE based on the uniform and textured phantoms. It was found that at the same dose, the improvement in detectability from SAFIRE (compared to FBP) was higher when measured in a uniform phantom compared to textured phantoms.

The final part of this project aimed to develop methods to mathematically model lesions, as a means to help assess image quality directly from patient images. The mathematical modeling framework is first presented. The models describe a lesion’s morphology in terms of size, shape, contrast, and edge profile as an analytical equation. The models can be voxelized and inserted into patient images to create so-called “hybrid” images. These hybrid images can then be used to assess detectability or estimability with the advantage that the ground truth of the lesion morphology and location is known exactly. Based on this framework, a series of liver lesions, lung nodules, and kidney stones were modeled based on images of real lesions. The lesion models were virtually inserted into patient images to create a database of hybrid images to go along with the original database of real lesion images. ROI images from each database were assessed by radiologists in a blinded fashion to determine the realism of the hybrid images. It was found that the radiologists could not readily distinguish between real and virtual lesion images (area under the ROC curve was 0.55). This study provided evidence that the proposed mathematical lesion modeling framework could produce reasonably realistic lesion images.
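A minimal sketch of the voxelized-lesion idea, assuming a simple circular lesion with a sigmoidal edge profile (the dissertation's fitted morphology models are richer):

```python
import numpy as np

# Sketch of the analytical lesion model idea: parameterize a lesion by size,
# contrast and edge profile, voxelize it, and add it to an image to create a
# "hybrid" image with exactly known ground truth. All parameter values are
# illustrative, not the dissertation's fitted models.
def lesion_model(shape, center, radius_mm, contrast_hu, edge_mm, pixel_mm):
    """Circular lesion with a sigmoidal (logistic) edge profile."""
    yy, xx = np.indices(shape)
    r = np.hypot(yy - center[0], xx - center[1]) * pixel_mm
    arg = np.clip((r - radius_mm) / edge_mm, -50.0, 50.0)  # avoid exp overflow
    return contrast_hu / (1.0 + np.exp(arg))

background = np.full((128, 128), 60.0)          # uniform "liver" at 60 HU
lesion = lesion_model((128, 128), (64, 64), radius_mm=4.0,
                      contrast_hu=-15.0, edge_mm=0.5, pixel_mm=0.7)
hybrid = background + lesion                    # hybrid image, truth known
center_value = hybrid[64, 64]                   # inside the lesion
```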

Based on that result, two studies were conducted which demonstrated the utility of the lesion models. The first study used the modeling framework as a measurement tool to determine how dose and reconstruction algorithm affected the quantitative analysis of liver lesions, lung nodules, and renal stones in terms of their size, shape, attenuation, edge profile, and texture features. The same database of real lesion images used in the previous study was used for this study. That database contained images of the same patient at 2 dose levels (50% and 100%) along with 3 reconstruction algorithms from a GE 750HD CT system (GE Healthcare). The algorithms in question were FBP, Adaptive Statistical Iterative Reconstruction (ASiR), and Model-Based Iterative Reconstruction (MBIR). A total of 23 quantitative features were extracted from the lesions under each condition. It was found that both dose and reconstruction algorithm had a statistically significant effect on the feature measurements. In particular, radiation dose affected five, three, and four of the 23 features (related to lesion size, conspicuity, and pixel-value distribution) for liver lesions, lung nodules, and renal stones, respectively. MBIR significantly affected 9, 11, and 15 of the 23 features (including size, attenuation, and texture features) for liver lesions, lung nodules, and renal stones, respectively. Lesion texture was not significantly affected by radiation dose.

The second study demonstrating the utility of the lesion modeling framework focused on assessing detectability of very low-contrast liver lesions in abdominal imaging. Specifically, detectability was assessed as a function of dose and reconstruction algorithm. As part of a parallel clinical trial, images from 21 patients were collected at 6 dose levels per patient on a SOMATOM Flash scanner. Subtle liver lesion models (contrast = -15 HU) were inserted into the raw projection data from the patient scans. The projections were then reconstructed with FBP and SAFIRE (strength 5). Lesion-free images were also reconstructed. Noise, contrast, CNR, and the detectability index of an observer model (non-prewhitening matched filter) were assessed. It was found that SAFIRE reduced noise by 52%, reduced contrast by 12%, increased CNR by 87%, and increased detectability index by 65% compared to FBP. Further, a 2AFC human perception experiment was performed to assess the dose reduction potential of SAFIRE, which was found to be 22% compared to the standard-of-care dose.
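The link between a 2AFC percent-correct score and a detectability index can be written in a few lines; the 91% figure below is illustrative, not a result from the study:

```python
from statistics import NormalDist

# How a 2AFC (two-alternative forced choice) percent-correct score maps to a
# detectability index: d' = sqrt(2) * z(Pc), where z is the inverse
# standard-normal CDF.
def d_prime_from_2afc(percent_correct):
    return 2 ** 0.5 * NormalDist().inv_cdf(percent_correct)

d = d_prime_from_2afc(0.91)   # illustrative percent correct
```

At chance performance (Pc = 0.5) the mapping gives d' = 0, as expected.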

In conclusion, this dissertation provides to the scientific community a series of new methodologies, phantoms, analysis techniques, and modeling tools that can be used to rigorously assess image quality from modern CT systems. Specifically, methods to properly evaluate iterative reconstruction have been developed and are expected to aid in the safe clinical implementation of dose reduction technologies.

Relevance:

30.00%

Publisher:

Abstract:

The extractive industry is characterized by high levels of risk and uncertainty. These attributes create challenges when applying traditional accounting concepts (such as the revenue recognition and matching concepts) to the preparation of financial statements in the industry. The International Accounting Standards Board (2010) states that the objective of general purpose financial statements is to provide useful financial information to assist the capital allocation decisions of existing and potential providers of capital. The usefulness of information is defined as being relevant and faithfully represented so as to best aid the investment decisions of capital providers. Value relevance research utilizes adaptations of the Ohlson (1995) model to assess value relevance, which is one of the attributes that make information useful. This study firstly examines the value relevance of the financial information disclosed in the financial reports of extractive firms. The findings reveal that the value relevance of information disclosed in the financial reports depends on the circumstances of the firm, including sector, size and profitability. Traditional accounting concepts such as the matching concept can be ineffective when applied to small firms that are primarily engaged in non-production activities involving significant levels of uncertainty, such as exploration activities or the development of sites. Standard setting bodies such as the International Accounting Standards Board and the Financial Accounting Standards Board have addressed the financial reporting challenges in the extractive industry by allowing a significant amount of accounting flexibility in industry-specific accounting standards, particularly in relation to the accounting treatment of exploration and evaluation expenditure.
Therefore, secondly, this study examines whether the choice of exploration accounting policy has an effect on the value relevance of information disclosed in the financial reports. The findings show that, in general, the Successful Efforts method produces value relevant information in the financial reports of profitable extractive firms. However, specifically in the oil & gas sector, the Full Cost method produces value relevant asset disclosures if the firm is loss-making. This indicates that investors in production- and non-production-orientated firms have different information needs, and these needs cannot be simultaneously fulfilled by a single accounting policy. In the mining sector, a preference by large profitable mining companies towards a more conservative policy than either the Full Cost or Successful Efforts methods does not result in more value relevant information being disclosed in the financial reports. This finding supports the view that the qualitative characteristic of prudence is a form of bias which has a downward effect on asset values. The third aspect of this study is an examination of the effect of corporate governance on the value relevance of disclosures made in the financial reports of extractive firms. The findings show that the key factor influencing the value relevance of financial information is the ability of the directors to select accounting policies which effectively reflect the economic substance of the particular circumstances facing the firm. Corporate governance is found to have an effect on value relevance, particularly in the oil & gas sector. However, there is no significant difference between the exploration accounting policy choices made by directors of firms with good systems of corporate governance and those with weak systems of corporate governance.
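A value-relevance regression in the spirit of the Ohlson (1995) model can be sketched as follows, with simulated firm data (coefficients and sample are illustrative, not from the study):

```python
import numpy as np

# Sketch of a value-relevance regression in the spirit of the Ohlson (1995)
# model: regress market value on book value and earnings and read R^2 as the
# (in-sample) value relevance of the accounting numbers. Data are simulated.
rng = np.random.default_rng(3)
n = 200
book_value = rng.uniform(1.0, 10.0, n)
earnings = rng.uniform(-1.0, 2.0, n)
price = 0.8 * book_value + 3.0 * earnings + rng.normal(0.0, 1.0, n)

X = np.column_stack([np.ones(n), book_value, earnings])   # intercept, BV, E
coef, *_ = np.linalg.lstsq(X, price, rcond=None)
fitted = X @ coef
r_squared = 1.0 - np.sum((price - fitted) ** 2) / np.sum(
    (price - price.mean()) ** 2)
```

In the value-relevance literature the R² (or the significance of the accounting coefficients) across subsamples, e.g. by sector or accounting policy, is the quantity compared.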

Relevance:

30.00%

Publisher:

Abstract:

Nutrient addition experiments were performed during the austral summer in the Amundsen Sea (Southern Ocean) to investigate the availability of organically bound iron (Fe) to the phytoplankton communities, as well as to assess their response to Fe amendment. Changes in autotrophic biomass, pigment concentration, maximum photochemical efficiency of photosystem II, and nutrient concentration were recorded in response to the addition of dissolved free Fe (DFe) and Fe bound to different model ligands. Analysis of pigment concentrations indicated that the autotrophic community was dominated by the prymnesiophyte Phaeocystis antarctica throughout most of the Amundsen Sea, although diatoms dominated in two experiments conducted in the marginal ice zone. Few significant differences in bulk community biomass (particulate organic carbon, nitrogen, and chlorophyll a) were observed, relative to the controls, in treatments with Fe added alone or bound to the ligand phytic acid. In contrast, when Fe was bound to the ligand desferrioxamine B (DFB), decreases in the bulk biomass indices were observed. The concentration of the diatom accessory pigment fucoxanthin showed little response to Fe additions, while the concentration of the P. antarctica-specific pigment, 19'-hexanoyloxyfucoxanthin (19'-hex), decreased when Fe was added alone or bound to the model ligands. Lastly, differences in the nitrate:phosphate (NO3⁻:PO4³⁻) utilization ratio were observed between the Fe-amended treatments, with Fe bound to DFB resulting in the lowest NO3⁻:PO4³⁻ uptake ratios (~10) and the remaining Fe treatments having higher NO3⁻:PO4³⁻ uptake ratios (~17). The data are discussed with respect to glacial inputs of Fe in the Amundsen Sea and the bioavailability of Fe. We suggest that the previously observed high NO3⁻:PO4³⁻ utilization ratio of P. antarctica is a consequence of its production of dissolved organic matter that acts as a ligand and increases the bioavailability of Fe, thereby stimulating the uptake of NO3⁻.

Relevance:

30.00%

Publisher:

Abstract:

Wind generation in highly interconnected power networks creates local and centralised stability issues based on its proximity to conventional synchronous generators and load centres. This paper examines the large disturbance stability issues (i.e. rotor angle and voltage stability) in power networks with geographically distributed wind resources in the context of a number of dispatch scenarios based on profiles of historical wind generation for a real power network. Stability issues have been analysed using novel stability indices developed from dynamic characteristics of wind generation. The results of this study show that localised stability issues worsen when significant penetration of both conventional and wind generation is present, due to their non-complementary characteristics. In contrast, network stability improves when a high penetration of either wind or synchronous generation is present in the network. Therefore, network regions can be clustered into two distinct stability groups (i.e. superior stability and inferior stability regions). Network stability improves when a voltage control strategy is implemented at wind farms; however, both stability clusters remain unchanged irrespective of the change in control strategy. Moreover, this study has shown that the enhanced fault ride-through (FRT) strategy for wind farms can improve both voltage and rotor angle stability locally, but only a marginal improvement is evident in neighbouring regions.

Relevance:

30.00%

Publisher:

Abstract:

PURPOSE: To study, for the first time, the effect of wearing ready-made glasses and glasses with power determined by self-refraction on children's quality of life. METHODS: This is a randomized, double-masked non-inferiority trial. Children in grades 7 and 8 (age 12-15 years) in nine Chinese secondary schools, with presenting visual acuity (VA) ≤6/12 improved with refraction to ≥6/7.5 bilaterally, refractive error ≤-1.0 D and <2.0 D of anisometropia and astigmatism bilaterally, were randomized to receive ready-made spectacles (RM) or identical-appearing spectacles with power determined by: subjective cycloplegic retinoscopy by a university optometrist (U), a rural refractionist (R) or non-cycloplegic self-refraction (SR). Main study outcome was global score on the National Eye Institute Refractive Error Quality of Life-42 (NEI-RQL-42) after 2 months of wearing study glasses, comparing other groups with the U group, adjusting for baseline score. RESULTS: Only one child (0.18%) was excluded for anisometropia or astigmatism. A total of 426 eligible subjects (mean age 14.2 years, 84.5% without glasses at baseline) were allocated to U [103 (24.2%)], RM [113 (26.5%)], R [108 (25.4%)] and SR [102 (23.9%)] groups, respectively. Baseline and endline score data were available for 398 (93.4%) of subjects. In multiple regression models adjusting for baseline score, older age (p = 0.003) and baseline spectacle wear (p = 0.016), but not study group assignment, were significantly associated with lower final score. CONCLUSION: Quality of life wearing ready-mades or glasses based on self-refraction did not differ from that with cycloplegic refraction by an experienced optometrist in this non-inferiority trial.

Relevance:

30.00%

Publisher:

Abstract:

Energy policies around the world mandate a progressive increase in renewable energy production. Extensive grassland areas with low productivity and land use limitations have become target areas for sustainable energy production, to avoid competition with food production on the limited available arable land and to minimize further conversion of grassland into intensively managed energy cropping systems or abandonment. However, the high spatio-temporal variability in botanical composition and biochemical parameters is detrimental to reliable assessment of biomass yield and quality with regard to anaerobic digestion. In an approach to assess the performance of a multi-sensor combination including NIRS, ultrasonic distance measurements and LAI-2000 for predicting biomass, biweekly sensor measurements were taken on a pure stand of reed canary grass (Phalaris arundinacea), a legume-grass mixture and a diversity mixture with thirty-six species in an experimental extensive two-cut management system. Different combinations of the sensor response values were used in multiple regression analysis to improve biomass predictions compared to the individual sensors. Wavelength bands for sensor-specific NDVI-type vegetation indices were selected from the hyperspectral data and evaluated for biomass prediction, both on their own and in combination with LAI and ultrasonic distance measurements. Ultrasonic sward height was the best single-sensor predictor of biomass (R² = 0.73–0.76). The addition of LAI-2000 improved the prediction performance by up to 30%, while NIRS barely improved it. In an approach to evaluate broad-based prediction of biochemical parameters relevant for anaerobic digestion using hyperspectral NIRS, spectroscopic measurements were taken on biomass from the Jena-Experiment plots in 2008 and 2009.
Measurements were conducted on different conditions of the biomass, including standing sward, hay and silage, and with different spectroscopic devices, to simulate preparation and measurement conditions along the process chain for biogas production. The best prediction results were acquired for all constituents at laboratory measurement conditions with dried and ground samples on a bench-top NIRS system (RPD > 3, coefficient of determination R² > 0.9). The same biomass was further used in batch fermentation to analyse the impact of species richness and functional group composition on methane yields using whole-crop digestion and press fluid derived by the Integrated generation of solid Fuel and Biogas from Biomass (IFBB) procedure. Although the effects of species richness and functional group composition were largely insignificant, the presence of grasses and legumes in the mixtures was the most important factor influencing methane yields in whole-crop digestion. High lignocellulose content and a high C/N ratio in grasses may have reduced digestibility in the first-cut material, and excess nitrogen may have inhibited methane production in second-cut legumes, while batch experiments proved superior specific methane yields of IFBB press fluids and showed that detrimental effects of the parent material were reduced by the technical treatment.
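The multi-sensor combination described above (an NDVI-type spectral index plus ultrasonic sward height in a multiple regression) can be sketched with simulated data; band positions and coefficients are assumptions, not the study's calibration:

```python
import numpy as np

# Sketch of the multi-sensor combination: an NDVI-type index from two
# hyperspectral bands plus ultrasonic sward height in a multiple linear
# regression predicting biomass. All data are simulated.
rng = np.random.default_rng(4)
n = 80
height_cm = rng.uniform(10.0, 60.0, n)      # ultrasonic sward height
nir = rng.uniform(0.4, 0.8, n)              # near-infrared reflectance
red = rng.uniform(0.05, 0.2, n)             # red reflectance
ndvi = (nir - red) / (nir + red)            # NDVI-type vegetation index

biomass = 0.05 * height_cm + 2.0 * ndvi + rng.normal(0.0, 0.2, n)  # t/ha

X = np.column_stack([np.ones(n), height_cm, ndvi])
coef, *_ = np.linalg.lstsq(X, biomass, rcond=None)
pred = X @ coef
r2 = 1.0 - np.sum((biomass - pred) ** 2) / np.sum(
    (biomass - biomass.mean()) ** 2)
```

Comparing this R² against a height-only (or NDVI-only) fit mirrors the study's single-sensor vs combined-sensor comparison.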

Relevance:

30.00%

Publisher:

Abstract:

Digital Image Processing is a rapidly evolving field with growing applications in Science and Engineering. It involves changing the nature of an image in order to either improve its pictorial information for human interpretation or render it more suitable for autonomous machine perception. One of the major areas of image processing for human vision applications is image enhancement. The principal goal of image enhancement is to improve the visual quality of an image, typically by taking advantage of the response of the human visual system. Image enhancement methods are usually carried out in the pixel domain. Transform domain methods can often provide another way to interpret and understand image contents. A suitable transform, thus selected, should have low computational complexity. Sequency-ordered arrangement of unique MRT (Mapped Real Transform) coefficients can give rise to an integer-to-integer transform, named Sequency-based unique MRT (SMRT), suitable for image processing applications. The development of the SMRT from the UMRT (Unique MRT), forward & inverse SMRT algorithms and the basis functions are introduced. A few properties of the SMRT are explored and its scope in lossless text compression is presented.
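The SMRT itself is not reproduced here, but the sequency-ordering principle it relies on can be illustrated with a related integer-to-integer transform: the Walsh-Hadamard transform with its rows sorted by sequency (number of sign changes). This is a hedged sketch of the ordering idea, not the SMRT algorithm:

```python
import numpy as np

# Illustration of sequency ordering with a related integer-to-integer
# transform (Walsh-Hadamard), not the SMRT itself.
def hadamard(n):
    """Naturally ordered Hadamard matrix of size n (n a power of two)."""
    h = np.array([[1]])
    while h.shape[0] < n:
        h = np.block([[h, h], [h, -h]])
    return h

def sequency_ordered(n):
    """Reorder Hadamard rows by number of sign changes (sequency)."""
    h = hadamard(n)
    changes = [int(np.sum(row[1:] != row[:-1])) for row in h]
    return h[np.argsort(changes)]

H = sequency_ordered(8)
x = np.array([1, 2, 3, 4, 4, 3, 2, 1])
coeffs = H @ x                    # integer transform coefficients
recovered = (H.T @ coeffs) // 8   # rows are orthogonal: H.T @ H == 8 * I
```

Sequency ordering groups coefficients from slowly varying to rapidly varying basis functions, analogous to frequency ordering in the DFT.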

Relevance:

30.00%

Publisher:

Abstract:

Thesis (Master's)--University of Washington, 2016-08

Relevance:

30.00%

Publisher:

Abstract:

Cognitive radio (CR) is fast emerging as a promising technology that can meet the machine-to-machine (M2M) communication requirements for spectrum utilization and power control for the large number of machines/devices expected to be connected to the Internet of Things (IoT). Power control in CR as a secondary user can be modelled with a non-cooperative game cost function to quantify and reduce the effects of its interference while occupying the same spectrum as the primary user, without adversely affecting the required quality of service (QoS) in the network. In this paper, a power loss exponent that factors in diverse operating environments for IoT is employed in the non-cooperative game cost function to quantify the required transmission power in the network. The approach would enable various CRs to transmit with lower power, thereby saving battery consumption, or increase the number of secondary users, thereby using the network resources efficiently.
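A minimal sketch of iterative, game-style power control of the kind the cost-function formulation generalizes: each user best-responds by scaling its power toward a target SINR (Foschini-Miljanic style). Gains, noise, and target values are illustrative, not the paper's model:

```python
# Distributed power control sketch: each user iteratively scales its transmit
# power toward a target SINR given the interference from the other user.
gains = [[1.0, 0.1], [0.2, 1.0]]   # gains[i][j]: gain, transmitter j -> receiver i
noise = 0.01
target_sinr = 2.0

power = [1.0, 1.0]
for _ in range(200):
    new_power = []
    for i in range(2):
        interference = noise + sum(gains[i][j] * power[j]
                                   for j in range(2) if j != i)
        sinr = gains[i][i] * power[i] / interference
        new_power.append(power[i] * target_sinr / sinr)  # best response
    power = new_power

# At the fixed point each user meets the target SINR with minimal power.
sinrs = [gains[i][i] * power[i] /
         (noise + sum(gains[i][j] * power[j] for j in range(2) if j != i))
         for i in range(2)]
```

A game cost function would add a pricing term to this update; the fixed-point structure of the iteration is the same.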

Relevance:

30.00%

Publisher:

Abstract:

We compare the optical properties and device performance of unpackaged InGaN/GaN multiple-quantum-well light-emitting diodes (LEDs) emitting at ∼430 nm grown simultaneously on a high-cost, small-size bulk semipolar (11-22) GaN substrate (Bulk-GaN) and a low-cost, large-size (11-22) GaN template created on patterned (10-12) r-plane sapphire substrate (PSS-GaN). The Bulk-GaN substrate has a threading dislocation density (TDD) of ∼ and a basal-plane stacking fault (BSF) density of 0 cm^-1, while the PSS-GaN substrate has a TDD of ∼2 × 10^8 cm^-2 and a BSF density of ∼1 × 10^3 cm^-1. Despite an enhanced light extraction efficiency, the LED grown on PSS-GaN has a two-times lower internal quantum efficiency than the LED grown on Bulk-GaN, as determined by photoluminescence measurements. The LED grown on the PSS-GaN substrate also has about two-times lower output power compared to the LED grown on the Bulk-GaN substrate. This lower output power was attributed to the higher TDD and BSF density.

Relevance:

30.00%

Publisher:

Abstract:

In economics of information theory, credence products are those whose quality is difficult or impossible for consumers to assess, even after they have consumed the product (Darby & Karni, 1973). This dissertation is focused on the content, consumer perception, and power of online reviews for credence services. Economics of information theory has long assumed, without empirical confirmation, that consumers will discount the credibility of claims about credence quality attributes. The same theories predict that because credence services are by definition obscure to the consumer, reviews of credence services are incapable of signaling quality. Our research aims to question these assumptions. In the first essay we examine how the content and structure of online reviews of credence services systematically differ from the content and structure of reviews of experience services and how consumers judge these differences. We have found that online reviews of credence services have either less important or less credible content than reviews of experience services and that consumers do discount the credibility of credence claims. However, while consumers rationally discount the credibility of simple credence claims in a review, more complex argument structure and the inclusion of evidence attenuate this effect. In the second essay we ask, “Can online reviews predict the worst doctors?” We examine the power of online reviews to detect low quality, as measured by state medical board sanctions. We find that online reviews are somewhat predictive of a doctor’s suitability to practice medicine; however, not all the data are useful. Numerical or star ratings provide the strongest quality signal; user-submitted text provides some signal but is subsumed almost completely by ratings. Of the ratings variables in our dataset, we find that punctuality, rather than knowledge, is the strongest predictor of medical board sanctions. 
These results challenge the definition of credence products, which is a long-standing construct in economics of information theory. Our results also have implications for online review users, review platforms, and for the use of predictive modeling in the context of information systems research.
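The second essay's predictive-modelling finding, that numerical star ratings carry the strongest quality signal, can be illustrated with a minimal logistic-regression sketch. The classifier, the toy data, and the feature layout ([punctuality, knowledge] ratings on a 1-5 scale) are all invented for illustration; this is not the dissertation's actual model or dataset.

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def train_logistic(X, y, lr=0.1, epochs=3000):
    """Plain stochastic-gradient-descent logistic regression (no regularisation)."""
    w = [0.0] * len(X[0])
    b = 0.0
    for _ in range(epochs):
        for xi, yi in zip(X, y):
            p = sigmoid(sum(wj * xj for wj, xj in zip(w, xi)) + b)
            err = p - yi  # gradient of the log-loss w.r.t. the linear score
            w = [wj - lr * err * xj for wj, xj in zip(w, xi)]
            b -= lr * err
    return w, b

def predict(w, b, x):
    """Predicted probability of the positive class (here: being sanctioned)."""
    return sigmoid(sum(wj * xj for wj, xj in zip(w, x)) + b)

# Invented ratings: [punctuality, knowledge] on a 1-5 scale; label 1 = sanctioned.
X = [[1, 4], [2, 3], [1, 5], [5, 4], [4, 5], [5, 3]]
y = [1, 1, 1, 0, 0, 0]

w, b = train_logistic(X, y)
# In this toy data, low punctuality drives the predicted sanction probability up,
# mirroring the finding that punctuality was the strongest predictor of sanctions.
```

On this deliberately separable toy set, the learned weight on the punctuality feature comes out negative, so a doctor rated 1 on punctuality receives a high predicted sanction probability while one rated 5 receives a low one.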

Relevância:

30.00% 30.00%

Publicador:

Resumo:

This dissertation research points out major challenges with current Knowledge Organization (KO) systems, such as subject gateways and web directories: (1) the current systems use traditional knowledge organization schemes based on controlled vocabulary, which is not well suited to web resources, and (2) information is organized by professionals rather than by users, so it does not reflect users' intuitively and instantaneously expressed current needs. In order to explore users' needs, I examined social tags, which are user-generated uncontrolled vocabulary. As investment in professionally developed subject gateways and web directories diminishes (support for both BUBL and Intute, examined in this study, is being discontinued), understanding the characteristics of social tagging becomes even more critical. Several researchers have discussed social tagging behavior and its usefulness for classification or retrieval; however, further research is needed to investigate social tagging qualitatively and quantitatively in order to verify its quality and benefit. This research particularly examined the indexing consistency of social tagging in comparison to professional indexing in order to assess the quality and efficacy of tagging. The data analysis was divided into three phases: analysis of indexing consistency, analysis of tagging effectiveness, and analysis of tag attributes. Most indexing consistency studies have been conducted with a small number of professional indexers, and they tended to exclude users; furthermore, such studies have mainly focused on physical library collections. This dissertation research bridged these gaps by (1) extending the scope of resources to various web documents indexed by users and (2) employing the Information Retrieval (IR) Vector Space Model (VSM)-based indexing consistency method, since it is suitable for dealing with a large number of indexers.
As a second phase, an analysis of tagging effectiveness based on tagging exhaustivity and tag specificity was conducted to ameliorate the drawbacks of a consistency analysis based only on quantitative measures of vocabulary matching. Finally, to investigate tagging patterns and behaviors, a content analysis of tag attributes was conducted based on the FRBR model. The findings revealed greater consistency across all subjects among taggers than between the two groups of professionals. Examination of the exhaustivity and specificity of social tags provided insights into particular characteristics of tagging behavior and its variation across subjects. To further investigate the quality of tags, a Latent Semantic Analysis (LSA) was conducted to determine to what extent tags are conceptually related to professionals' keywords; tags of higher specificity tended to have higher semantic relatedness to professionals' keywords. This leads to the conclusion that a term's power as a differentiator is related to its semantic relatedness to documents. The findings on tag attributes identified important bibliographic attributes of tags beyond describing the subjects or topics of a document, and showed that tags have essential attributes matching those defined in FRBR. Furthermore, in terms of specific subject areas, the findings identified that taggers exhibited distinctive tagging behaviors and tendencies on web documents characterizing heterogeneous digital media resources.
These results have led to the conclusion that there should be an increased awareness of diverse user needs by subject in order to improve metadata in practical applications. This dissertation research is the first necessary step to utilize social tagging in digital information organization by verifying the quality and efficacy of social tagging. This dissertation research combined both quantitative (statistics) and qualitative (content analysis using FRBR) approaches to vocabulary analysis of tags which provided a more complete examination of the quality of tags. Through the detailed analysis of tag properties undertaken in this dissertation, we have a clearer understanding of the extent to which social tagging can be used to replace (and in some cases to improve upon) professional indexing.
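The VSM-based indexing-consistency measure described above can be sketched as the cosine similarity between two indexers' term-frequency vectors. This is a minimal illustration of the general technique, not the dissertation's actual implementation; the example tag sets are invented.

```python
import math
from collections import Counter

def cosine_consistency(terms_a, terms_b):
    """Cosine similarity between two indexers' term-frequency vectors.

    Each indexer's terms are turned into a sparse frequency vector over the
    combined vocabulary; 1.0 means identical indexing, 0.0 means no overlap.
    """
    va, vb = Counter(terms_a), Counter(terms_b)
    vocab = set(va) | set(vb)
    dot = sum(va[t] * vb[t] for t in vocab)
    norm_a = math.sqrt(sum(c * c for c in va.values()))
    norm_b = math.sqrt(sum(c * c for c in vb.values()))
    return dot / (norm_a * norm_b) if norm_a and norm_b else 0.0

# Invented example: a tagger's terms vs. a professional indexer's terms.
tagger = ["python", "tutorial", "programming", "web"]
professional = ["programming", "python", "education"]
print(round(cosine_consistency(tagger, professional), 3))  # → 0.577
```

Because the measure operates on vectors rather than on pairwise term matches, it scales naturally to the large numbers of indexers found in social tagging data, which is the suitability argument made above.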

Relevância:

30.00% 30.00%

Publicador:

Resumo:

A smart solar photovoltaic grid system is an innovation that brings together information and communications technology (ICT) and power systems control engineering via the internet [1]. This thesis designs and demonstrates a smart solar photovoltaic grid system that is self-healing and environmentally and consumer friendly, and that can accommodate other renewable sources of energy generation seamlessly, creating a healthy, competitive energy industry and optimising the efficiency of energy assets. The thesis also presents the modelling of an efficient dynamic smart solar photovoltaic power grid system, exploring maximum power point tracking efficiency and optimising the smart solar photovoltaic array through modelling and simulation to improve the quality of design of the solar photovoltaic module. Although quite promising results have been published in the literature over the past decade, most of them do not address the research questions posed in this thesis. The Levenberg-Marquardt and sparse-based algorithms proved to be very effective tools for improving the quality of design of solar photovoltaic modules, minimising the possible relative errors in this thesis. Guided by theoretical and analytical reviews of the literature, this research chose the MatLab/Simulink software toolbox for the modelling and simulation experiments performed on the static smart solar grid system. The auto-correlation coefficient results obtained from the modelling experiments give an accuracy of 99% with negligible mean square error (MSE), root mean square error (RMSE) and standard deviation. This thesis further explores the design and implementation of a robust real-time online solar photovoltaic monitoring system, establishing a comparative study of two solar photovoltaic tracking systems which provide remote access to the harvested energy data.
This research made a landmark innovation in designing and implementing a unique approach to online remote-access solar photovoltaic monitoring systems, providing updated information on the energy produced by the solar photovoltaic module at the site location. In addressing the challenge of online solar photovoltaic monitoring, the Darfon online data logger device has been systematically integrated into the design for a comparative study of the two solar photovoltaic tracking systems examined in this thesis. The site location for the comparative study is the National Kaohsiung University of Applied Sciences, Taiwan, R.O.C. The overall comparative energy output efficiency of the azimuthal-altitude dual-axis system over the 45° stationary solar photovoltaic monitoring system, as observed at the research site, is about 72%, based on the total energy produced, the estimated money saved and the amount of CO2 reduction achieved. Similarly, in comparing the total amount of energy produced by the two solar photovoltaic tracking systems, the overall daily generated energy for the month of July shows the effectiveness of the azimuthal-altitude tracking system over the 45° stationary solar photovoltaic system: the azimuthal-altitude dual-axis tracking system was found to be about 68.43% more efficient than the 45° stationary solar photovoltaic system. Lastly, the overall comparative hourly energy efficiency of the azimuthal-altitude dual-axis system over the 45° stationary solar photovoltaic energy system was found to be 74.2%.
Based on this new design of the online data-logging system for solar photovoltaic monitoring, it is possible for the first time to have on-site information on the energy produced available remotely online, with fault identification and rectification, maintenance and recovery deployed as fast as possible. The results presented in this research on the Internet of Things (IoT) in smart solar grid systems are likely to offer real-life experience both to the existing body of knowledge and to the future solar photovoltaic energy industry, irrespective of the study site location for the comparative solar photovoltaic tracking systems. While the thesis has contributed to the smart solar photovoltaic grid system, it has also highlighted areas for further research and the need to investigate further improvements in the choice and quality of design of solar photovoltaic modules. Finally, it has also made recommendations for further research into the minimisation of the absolute or relative errors in the quality and design of the smart static solar photovoltaic module.
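The maximum power point tracking explored in the thesis can be illustrated with a classic perturb-and-observe (P&O) sketch against a toy power-voltage curve. The quadratic P(V) model and all parameter values here are invented for illustration; this is a generic P&O controller, not the thesis's MatLab/Simulink models or its Levenberg-Marquardt-based optimisation.

```python
def pv_power(v):
    """Toy power-voltage curve with a single maximum at v = 30 V (invented model)."""
    return max(0.0, 100.0 - 0.25 * (v - 30.0) ** 2)

def perturb_and_observe(v=20.0, step=0.5, iters=100):
    """Classic P&O MPPT: keep perturbing the operating voltage in the
    direction that last increased the measured power; reverse on a drop."""
    p_prev = pv_power(v)
    direction = 1.0
    for _ in range(iters):
        v += direction * step
        p = pv_power(v)
        if p < p_prev:          # power dropped: reverse the perturbation
            direction = -direction
        p_prev = p
    return v

v_mpp = perturb_and_observe()
# The controller climbs the curve and then oscillates within one step of 30 V.
```

The residual oscillation around the maximum power point is the well-known trade-off of P&O: a larger `step` converges faster but hunts more widely around the true maximum.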

Relevância:

30.00% 30.00%

Publicador:

Resumo:

This study identifies and compares the competing policy stories of key actors involved in the Ecuadorian education reform under President Rafael Correa from 2007 to 2015. By revealing these competing policy stories, the study generates insights into the political and technical aspects of education reform in a context where state capacity has been eroded by decades of neoliberal policies. Since the elections in 2007, President Correa has focused much of his political effort and capital on reconstituting the state's authority and capacity not only to formulate but also to implement public policies. The concentration of power, combined with a capacity-building agenda, allowed the Correa government to advance an ambitious comprehensive education reform with substantive results in equity and quality. At the same time, the concentration of power has undermined a more inclusive and participatory approach, which is essential for deepening and sustaining the reform. This study underscores both the limits and the importance of state control over education; the inevitable conflicts and complexities associated with education reforms that focus on quality; and the limits and importance of participation in reform. Finally, it examines the analytical benefits of understanding governance, participation and quality as socially constructed concepts tied to normative and ideological interests.