881 results for: Interaction modeling. Model-based development. Interaction evaluation.


Relevance: 100.00%

Abstract:

This work presents a model for the development of project proposals by students as an approach to teaching information technology while promoting entrepreneurship and reflection. In teams of 3 to 5 participants, students elaborate a project proposal on a topic they have negotiated with each other and with the teacher. The project domain is related to the practical application of state-of-the-art information technology in areas of substantial public interest or of immediate interest to the participants. This gives them ample opportunities for reflection not only on the technical but also on the social, economic, environmental, and other dimensions of information technology. This approach has long been used with students of different years and programs of study at the Faculty of Mathematics and Informatics, Plovdiv University "Paisiy Hilendarski". It has been found to develop all eight key competences for lifelong learning set forth in the Reference Framework, as well as procedural skills required in real life.

Relevance: 100.00%

Abstract:

Report published in the Proceedings of the National Conference on "Education and Research in the Information Society", Plovdiv, May 2014.

Relevance: 100.00%

Abstract:

Report published in the Proceedings of the National Conference on "Education and Research in the Information Society", Plovdiv, May 2015.

Relevance: 100.00%

Abstract:

Urinary bladder diseases are a common problem throughout the world and are often difficult to diagnose accurately. Furthermore, they pose a heavy financial burden on health services. Urinary bladder tissue from male pigs was measured spectrophotometrically and the resulting data used to calculate absorption, transmission, and reflectance parameters, along with the derived scattering and absorption coefficients. These were employed to create a "generic" computational bladder model based on optical properties, simulating the propagation of photons through the tissue at different wavelengths. Using the Monte Carlo method and fluorescence spectra under UV and blue excitation wavelengths, diagnostically important biomarkers were modeled. Additionally, the multifunctional noninvasive diagnostics system "LAKK-M" was used to gather fluorescence data for essential comparisons. The ultimate goal of the study was to simulate the effects of varying excitation wavelengths on bladder tissue in order to determine the effectiveness of photonics diagnostic devices. With increased accuracy, this model could reliably aid in differentiating healthy from pathological tissue within the bladder and potentially other hollow organs.
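To make the photon-transport core of such a model concrete, here is a minimal Monte Carlo sketch in Python. The optical coefficients and slab geometry are illustrative assumptions rather than the measured pig-bladder values, and real tissue codes use an anisotropic (e.g., Henyey-Greenstein) phase function instead of the isotropic scattering used here.

```python
import numpy as np
from collections import Counter

rng = np.random.default_rng(0)

# Illustrative optical properties (per mm) at one wavelength; the study
# derived tissue-specific values from its spectrophotometric measurements.
mu_a, mu_s = 0.2, 10.0            # absorption / scattering coefficients
mu_t = mu_a + mu_s                # total interaction coefficient
thickness = 2.0                   # tissue slab thickness, mm

def run_photon():
    """Trace one photon through a homogeneous slab and return its fate."""
    z, uz = 0.0, 1.0                              # depth, direction cosine
    while True:
        z += uz * rng.exponential(1.0 / mu_t)     # free path to next event
        if z < 0.0:
            return "reflected"
        if z > thickness:
            return "transmitted"
        if rng.random() < mu_a / mu_t:            # analog absorption
            return "absorbed"
        uz = rng.uniform(-1.0, 1.0)               # isotropic redirection; real
                                                  # codes use Henyey-Greenstein

fates = Counter(run_photon() for _ in range(20_000))
for fate, count in fates.items():
    print(f"{fate:12s} {count / 20_000:.3f}")
```

Tallying the reflected and transmitted fractions over many photons is how such a model predicts the reflectance and transmittance that are compared against the measured spectra.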

Relevance: 100.00%

Abstract:

This study examines the effect of blood absorption on the endogenous fluorescence signal intensity of biological tissues. Experimental studies were conducted to identify these effects. Fluorescence intensity was registered using fluorescence spectroscopy, and the intensity of blood flow was measured by laser Doppler flowmetry. We propose one possible implementation of the Monte Carlo method for the theoretical analysis of the effect of blood on fluorescence signals. The simulation is constructed as a four-layer skin optical model based on the known optical parameters of skin at different levels of blood supply. With the help of the simulation, we demonstrate how the level of blood supply can affect the appearance of the fluorescence spectra. In addition, to describe the properties of biological tissue that may affect the fluorescence spectra, we turned to diffuse reflectance spectroscopy (DRS). Using the spectral data provided by DRS, the tissue attenuation effect can be extracted and used to correct the fluorescence spectra.
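The correction step can be illustrated with a simple ratio-style scheme in which measured fluorescence is divided by an attenuation factor inferred from the DRS reflectance. The spectra and the exponent k below are illustrative assumptions; the paper's actual correction procedure may differ in detail.

```python
import numpy as np

# Hypothetical spectra on a shared wavelength grid (arbitrary units); in
# practice these come from the fluorescence channel and the DRS channel.
wavelengths = np.linspace(450, 700, 251)                   # nm
f_measured = np.exp(-((wavelengths - 520.0) / 40.0) ** 2)  # raw fluorescence
r_diffuse = 0.3 + 0.4 * (wavelengths - 450.0) / 250.0      # DRS reflectance

# Ratio-style correction: divide out the attenuation inferred from DRS.
# The exponent k is an empirical assumption (k = 1 is the simplest choice).
k = 1.0
f_corrected = f_measured / np.clip(r_diffuse, 1e-6, None) ** k

peak_shift = wavelengths[f_corrected.argmax()] - wavelengths[f_measured.argmax()]
print(f"apparent peak shift after correction: {peak_shift:.1f} nm")
```

The printed peak shift illustrates the point of the correction: wavelength-dependent attenuation by blood distorts the apparent shape of the fluorescence spectrum, and dividing it out recovers a spectrum closer to the intrinsic one.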

Relevance: 100.00%

Abstract:

This paper is about the development and application of an ESRI ArcGIS tool that implements a multi-layer, feed-forward artificial neural network (ANN) to study the climate envelope of species. Supervised learning is achieved by the backpropagation algorithm. Based on the species distribution and on the climate (and edaphic) grids of the reference and future periods, the tool predicts the future potential distribution of the studied species. The trained network can be saved and loaded. A modeling result based on the distribution of European larch (Larix decidua Mill.) is presented as a case study.
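To make the network concrete, here is a minimal NumPy sketch of a one-hidden-layer feed-forward network trained by backpropagation on synthetic presence/absence data. The predictor count, layer sizes, and learning rate are illustrative assumptions, not the tool's actual configuration.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy stand-ins for the tool's inputs: rows are grid cells, columns are
# climate predictors (e.g., temperature, precipitation); y is species
# presence/absence over the reference period.
X = rng.normal(size=(200, 4))
y = (X[:, 0] - 0.5 * X[:, 1] > 0).astype(float).reshape(-1, 1)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# One hidden layer: the smallest "multi-layer, feed-forward" configuration.
W1 = rng.normal(scale=0.5, size=(4, 8)); b1 = np.zeros(8)
W2 = rng.normal(scale=0.5, size=(8, 1)); b2 = np.zeros(1)

lr = 0.5
for _ in range(2000):
    h = sigmoid(X @ W1 + b1)                 # forward pass
    p = sigmoid(h @ W2 + b2)
    grad_out = (p - y) / len(X)              # backpropagated cross-entropy grad
    grad_h = grad_out @ W2.T * h * (1.0 - h)
    W2 -= lr * h.T @ grad_out; b2 -= lr * grad_out.sum(0)
    W1 -= lr * X.T @ grad_h;   b1 -= lr * grad_h.sum(0)

# A trained network would then be applied to future-period climate grids.
print("training accuracy:", float(((p > 0.5) == y).mean()))
```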

Relevance: 100.00%

Abstract:

The purpose of this study was to document and critically analyze the lived experience of selected nursing staff developers in the process of moving toward a new model for hospital nursing education. Eleven respondents were drawn from a nation-wide population of about two hundred individuals involved in nursing staff development. These subjects were responsible for the implementation of the Performance Based Development System (PBDS) in their institutions. A purposive, criterion-based sampling technique was used, with respondents selected according to size of hospital, primary responsibility for orchestration of the change, influence over budgetary factors, and managerial responsibility for PBDS. Data were gathered by the researcher through both in-person and telephone interviews. A semi-structured interview guide, designed by the researcher, was used, and respondents were encouraged to amplify their recollections as desired. Audiotapes were transcribed and the resulting computer files analyzed using the program "Martin". Answers to interview questions were compiled and reported across cases. The data were then reviewed a second time and interpreted for emerging themes and patterns. Two types of verification were used in the study. Internal verification was done through interview transcript review and feedback by respondents. External verification was done through review of, and feedback on, the data analysis by readers experienced in the management of staff development departments. All respondents were female, so Gilligan's concept of the "ethic of care" was examined as a decision-making strategy. Three levels of caring that influenced decision making were found: caring (a) for the organization, (b) for the employee, and (c) for the patient. The four existentials of the lived experience (relationality, corporeality, temporality, and spatiality) were also examined to reveal the everydayness of making change.

Relevance: 100.00%

Abstract:

The purpose of the present dissertation was to evaluate the internal validity of symptoms of four common anxiety disorders included in the Diagnostic and Statistical Manual of Mental Disorders, fourth edition, text revision (DSM-IV-TR; American Psychiatric Association, 2000), namely separation anxiety disorder (SAD), social phobia (SOP), specific phobia (SP), and generalized anxiety disorder (GAD), in a sample of 625 youth (ages 6 to 17 years) referred to an anxiety disorders clinic and 479 parents. Confirmatory factor analyses (CFAs) were conducted on the dichotomous items of the SAD, SOP, SP, and GAD sections of the youth and parent versions of the Anxiety Disorders Interview Schedule for DSM-IV (ADIS-IV: C/P; Silverman & Albano, 1996) to test and compare a number of factor models, including a factor model based on the DSM. Contrary to predictions, the CFAs showed that a correlated model with five factors (SAD, SOP, SP, GAD worry, and GAD somatic distress) provided the best fit to both the youth data and the parent data. Multiple-group CFAs supported the metric invariance of the correlated five-factor model across boys and girls. Thus, the present study's findings support the internal validity of DSM-IV SAD, SOP, and SP, but raise doubt regarding the internal validity of GAD.
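For illustration only, a correlated-factors CFA of this kind can be sketched with the third-party semopy package, assuming its lavaan-style model syntax; the data below are synthetic, only two factors are shown, and dichotomous items would in practice call for a categorical estimator rather than the default maximum likelihood.

```python
import numpy as np
import pandas as pd
from semopy import Model   # third-party SEM package (one of several options)

rng = np.random.default_rng(2)

# Synthetic dichotomous items standing in for ADIS-IV: C/P section items;
# two correlated latent factors instead of the study's five.
latent = rng.multivariate_normal([0, 0], [[1, .5], [.5, 1]], size=400)
data = {}
for j, factor in enumerate(("sad", "gad")):
    for i in range(1, 4):
        data[f"{factor}{i}"] = (latent[:, j] + rng.normal(size=400) > 0).astype(int)
df = pd.DataFrame(data)

# Correlated two-factor CFA in lavaan-style syntax.
desc = """
SAD =~ sad1 + sad2 + sad3
GAD =~ gad1 + gad2 + gad3
SAD ~~ GAD
"""
model = Model(desc)
model.fit(df)            # caveat: dichotomous items really call for a
print(model.inspect())   # categorical estimator (e.g., WLSMV), glossed over here
```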

Relevance: 100.00%

Abstract:

X-ray computed tomography (CT) imaging constitutes one of the most widely used diagnostic tools in radiology today, with nearly 85 million CT examinations performed in the U.S. in 2011. CT imparts a relatively high radiation dose to the patient compared to other x-ray imaging modalities, and as a result of this fact, coupled with its popularity, CT is currently the single largest source of medical radiation exposure to the U.S. population. For this reason, there is a critical need to optimize CT examinations such that the dose is minimized while the quality of the CT images is not degraded. This optimization can be difficult to achieve due to the relationship between dose and image quality: all else being held equal, reducing the dose degrades image quality and can impact the diagnostic value of the CT examination.

A recent push from the medical and scientific community towards lower doses has spawned new dose reduction technologies such as automatic exposure control (i.e., tube current modulation) and iterative reconstruction algorithms. In theory, these technologies could allow scanning at reduced doses while maintaining image quality at an acceptable level. There is therefore a scientific need to establish the dose reduction potential of these new technologies in an objective and rigorous manner. Establishing these dose reduction potentials requires precise and clinically relevant metrics of CT image quality, as well as practical and efficient methodologies to measure such metrics on real CT systems. The currently established methodologies for assessing CT image quality are not appropriate for assessing modern CT scanners that implement the aforementioned dose reduction technologies.

Thus the purpose of this doctoral project was to develop, assess, and implement new phantoms, image quality metrics, analysis techniques, and modeling tools that are appropriate for image quality assessment of modern clinical CT systems. The project developed image quality assessment methods in the context of three distinct paradigms: (a) uniform phantoms, (b) textured phantoms, and (c) clinical images.

The work in this dissertation used the "task-based" definition of image quality: image quality was broadly defined as the effectiveness with which an image can be used for its intended task. Under this definition, any assessment of image quality requires three components: (1) a well-defined imaging task (e.g., detection of subtle lesions), (2) an "observer" to perform the task (e.g., a radiologist or a detection algorithm), and (3) a way to measure the observer's performance in completing the task at hand (e.g., detection sensitivity/specificity).

First, this task-based image quality paradigm was implemented using a novel multi-sized phantom platform (with uniform background) developed specifically to assess modern CT systems (Mercury Phantom, v3.0, Duke University). A comprehensive evaluation was performed on a state-of-the-art CT system (SOMATOM Definition Force, Siemens Healthcare) in terms of noise, resolution, and detectability as a function of patient size, dose, tube energy (i.e., kVp), automatic exposure control, and reconstruction algorithm (i.e., Filtered Back-Projection [FBP] vs. Advanced Modeled Iterative Reconstruction [ADMIRE]). A mathematical observer model (i.e., a computer detection algorithm) was implemented and used as the basis of image quality comparisons. It was found that image quality increased with increasing dose and decreasing phantom size. The CT system exhibited nonlinear noise and resolution properties, especially at very low doses, large phantom sizes, and for low-contrast objects. Objective image quality metrics generally increased with increasing dose and ADMIRE strength, and with decreasing phantom size. The ADMIRE algorithm could offer comparable image quality at reduced doses or improved image quality at the same dose (an increase in detectability index of up to 163%, depending on iterative strength). The use of automatic exposure control resulted in more consistent image quality with changing phantom size.

Based on those results, the dose reduction potential of ADMIRE was further assessed specifically for the task of detecting small (<=6 mm) low-contrast (<=20 HU) lesions. A new low-contrast detectability phantom (with uniform background) was designed and fabricated using a multi-material 3D printer. The phantom was imaged at multiple dose levels and images were reconstructed with FBP and ADMIRE. Human perception experiments were performed to measure the detection accuracy from FBP and ADMIRE images. It was found that ADMIRE had equivalent performance to FBP at 56% less dose.

Using the same image data as the previous study, a number of different mathematical observer models were implemented to assess which models produce image quality metrics that best correlate with human detection performance. The models ranged from simple metrics of image quality, such as the contrast-to-noise ratio (CNR), to more sophisticated observer models, such as the non-prewhitening matched filter observer model family and the channelized Hotelling observer model family. It was found that the non-prewhitening matched filter observers and the channelized Hotelling observers both correlated strongly with human performance. Conversely, CNR was found not to correlate strongly with human performance, especially when comparing different reconstruction algorithms.
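A minimal sketch of the contrast between the two kinds of metric follows, assuming a synthetic disk signal in white noise (real CT noise is correlated, in which case the covariance matrix K matters):

```python
import numpy as np

# Synthetic stand-ins: a low-contrast disk signal in white noise.
n, radius, contrast, sigma = 64, 6, 10.0, 20.0     # px, px, HU, HU
yy, xx = np.mgrid[:n, :n]
signal = contrast * (((xx - n / 2) ** 2 + (yy - n / 2) ** 2) < radius ** 2)

# CNR: mean lesion contrast over the noise standard deviation.
cnr = contrast / sigma

# Non-prewhitening matched filter: d'^2 = (s^T s)^2 / (s^T K s).
# For white noise, K = sigma^2 I, so d' = sqrt(s^T s) / sigma; correlated
# CT noise requires the full covariance K.
s = signal.ravel()
d_npw = np.sqrt(s @ s) / sigma

print(f"CNR = {cnr:.2f}, NPW d' = {d_npw:.2f}")
# Unlike CNR, d' integrates signal energy over the whole lesion, which is
# one reason observer models track human performance across reconstruction
# algorithms better than CNR does.
```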

The uniform background phantoms used in the previous studies provided a good first-order approximation of image quality. However, given their simplicity and the complexity of iterative reconstruction algorithms, it is possible that such phantoms are not fully adequate to assess the clinical impact of iterative algorithms, because patient images obviously do not have smooth uniform backgrounds. To test this hypothesis, two textured phantoms (classified as gross texture and fine texture) and a uniform phantom of similar size were built and imaged on a SOMATOM Flash scanner (Siemens Healthcare). Images were reconstructed using FBP and Sinogram Affirmed Iterative Reconstruction (SAFIRE). Using an image subtraction technique, quantum noise was measured in all images of each phantom. It was found that with FBP, the noise was independent of the background (textured vs. uniform). However, with SAFIRE, noise increased by up to 44% in the textured phantoms compared to the uniform phantom. As a result, the noise reduction from SAFIRE was found to be up to 66% in the uniform phantom but as low as 29% in the textured phantoms. Based on this result, it was clear that further investigation was needed to understand the impact that background texture has on image quality when iterative reconstruction algorithms are used.
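The image subtraction technique itself is compact enough to sketch. Assuming two repeated scans with identical deterministic content and independent noise, subtracting them cancels the background, and the noise standard deviation follows after dividing by sqrt(2):

```python
import numpy as np

rng = np.random.default_rng(4)

# Two repeated scans of the same toy phantom: identical deterministic
# background (texture), independent quantum noise realizations.
background = rng.normal(0.0, 30.0, size=(128, 128)).cumsum(axis=1)
scan1 = background + rng.normal(0.0, 12.0, size=background.shape)
scan2 = background + rng.normal(0.0, 12.0, size=background.shape)

# Subtraction cancels the background, leaving pure noise; the sqrt(2)
# accounts for the variance doubling when two noise fields are subtracted.
noise_sd = (scan1 - scan2).std() / np.sqrt(2.0)
print(f"estimated quantum noise: {noise_sd:.2f} HU (true value: 12.0)")
```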

To further investigate this phenomenon with more realistic textures, two anthropomorphic textured phantoms were designed to mimic lung vasculature and fatty soft-tissue texture. The phantoms (along with a corresponding uniform phantom) were fabricated with a multi-material 3D printer and imaged on the SOMATOM Flash scanner. Scans were repeated a total of 50 times in order to obtain ensemble statistics of the noise. A novel method of estimating the noise power spectrum (NPS) from irregularly shaped ROIs was developed. It was found that SAFIRE images had highly locally non-stationary noise patterns, with pixels near edges having higher noise than pixels in more uniform regions. Compared to FBP, SAFIRE images had 60% less noise on average in uniform regions; for edge pixels, noise was between 20% higher and 40% lower. The noise texture (i.e., the NPS) was also highly dependent on the background texture for SAFIRE. It was therefore concluded that quantum noise properties measured in uniform phantoms are not representative of those in patients when iterative reconstruction algorithms are used, and that texture should be considered when assessing the image quality of iterative algorithms.
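For orientation, here is a sketch of the classic ensemble NPS estimate from square ROIs; the dissertation's novel contribution, handling irregularly shaped ROIs, goes beyond this simple form. The data are synthetic white noise.

```python
import numpy as np

rng = np.random.default_rng(5)

pixel = 0.5                                        # pixel size, mm
rois = rng.normal(0.0, 12.0, size=(50, 64, 64))    # 50 repeats of one ROI

# Classic ensemble NPS from square ROIs: subtract the ensemble mean to
# remove deterministic structure, then average squared DFT magnitudes.
# (The dissertation's contribution generalizes this to irregular ROIs.)
fluct = rois - rois.mean(axis=0)
nps = (np.abs(np.fft.fft2(fluct)) ** 2).mean(axis=0) * pixel**2 / (64 * 64)

# Sanity check: the NPS should integrate back to the pixel variance.
variance_from_nps = nps.sum() / (64 * pixel) ** 2
print(f"variance from NPS: {variance_from_nps:.1f} (direct: {fluct.var():.1f})")
```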

To move beyond assessing noise properties and toward assessing detectability in the presence of background texture, a series of new phantoms was designed specifically to measure low-contrast detectability against textured backgrounds. The textures were optimized, using a genetic algorithm, to match the texture in the liver regions of actual patient CT images. The so-called "Clustered Lumpy Background" texture synthesis framework was used to generate the modeled texture. Three textured phantoms and a corresponding uniform phantom were fabricated with a multi-material 3D printer and imaged on the SOMATOM Flash scanner. Images were reconstructed with FBP and SAFIRE and analyzed using a multi-slice channelized Hotelling observer to measure detectability and the dose reduction potential of SAFIRE based on the uniform and textured phantoms. It was found that, at the same dose, the improvement in detectability from SAFIRE (compared to FBP) was higher when measured in a uniform phantom than in textured phantoms.
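For intuition, a much-simplified sketch of clustered lumpy background synthesis follows: Poisson-scattered cluster centers each spawn a handful of decaying lumps. Every parameter value is an illustrative stand-in, not a value fitted by the study's genetic algorithm.

```python
import numpy as np

rng = np.random.default_rng(6)

# Much-simplified clustered lumpy background: Poisson-scattered cluster
# centers, each spawning Gaussian-scattered, exponentially decaying lumps.
n = 128
img = np.zeros((n, n))
yy, xx = np.mgrid[:n, :n]
for _ in range(rng.poisson(30)):              # cluster centers
    cx, cy = rng.uniform(0, n, 2)
    for _ in range(rng.poisson(10)):          # lumps within the cluster
        lx = cx + rng.normal(0, 6.0)
        ly = cy + rng.normal(0, 6.0)
        img += np.exp(-np.hypot(xx - lx, yy - ly) / 4.0)   # lump profile

print(f"texture mean {img.mean():.2f}, std {img.std():.2f}")
```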

The final component of this project aimed to develop methods for mathematically modeling lesions as a means to help assess image quality directly from patient images. The mathematical modeling framework is first presented. The models describe a lesion's morphology in terms of size, shape, contrast, and edge profile as an analytical equation. The models can be voxelized and inserted into patient images to create so-called "hybrid" images. These hybrid images can then be used to assess detectability or estimability, with the advantage that the ground truth of the lesion morphology and location is known exactly. Based on this framework, a series of liver lesions, lung nodules, and kidney stones was modeled from images of real lesions. The lesion models were virtually inserted into patient images to create a database of hybrid images alongside the original database of real lesion images. ROI images from each database were assessed by radiologists in a blinded fashion to determine the realism of the hybrid images. It was found that the radiologists could not readily distinguish between real and virtual lesion images (area under the ROC curve: 0.55). This study provided evidence that the proposed mathematical lesion modeling framework can produce reasonably realistic lesion images.
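A hedged sketch of what such an analytical lesion model might look like, using a circular lesion with a sigmoid edge profile (the dissertation's actual parameterization may differ):

```python
import numpy as np

# Analytical lesion model: circular morphology with a smooth (sigmoid) edge,
# controlled by size, contrast, and edge width. Parameter names and the
# specific edge function are illustrative assumptions.
def lesion(shape=(64, 64), center=(32, 32), radius=8.0,
           contrast=-15.0, edge_width=1.5):
    yy, xx = np.mgrid[:shape[0], :shape[1]]
    r = np.hypot(xx - center[1], yy - center[0])
    # Sigmoid edge: ~contrast inside, ~0 outside, smooth transition between.
    return contrast / (1.0 + np.exp((r - radius) / edge_width))

# The voxelized model can be added to a patient image ROI to form a
# "hybrid" image with exactly known ground truth.
roi = lesion()
print(roi[32, 32], roi[0, 0])   # center ~ contrast, corner ~ 0
```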

Based on that result, two studies were conducted to demonstrate the utility of the lesion models. The first study used the modeling framework as a measurement tool to determine how dose and reconstruction algorithm affect the quantitative analysis of liver lesions, lung nodules, and renal stones in terms of their size, shape, attenuation, edge profile, and texture features. The same database of real lesion images used in the previous study was used here. That database contained images of the same patients at two dose levels (50% and 100%) along with three reconstruction algorithms from a GE 750HD CT system (GE Healthcare): FBP, Adaptive Statistical Iterative Reconstruction (ASiR), and Model-Based Iterative Reconstruction (MBIR). A total of 23 quantitative features were extracted from the lesions under each condition. It was found that both dose and reconstruction algorithm had a statistically significant effect on the feature measurements. In particular, radiation dose affected five, three, and four of the 23 features (related to lesion size, conspicuity, and pixel-value distribution) for liver lesions, lung nodules, and renal stones, respectively. MBIR significantly affected 9, 11, and 15 of the 23 features (including size, attenuation, and texture features) for liver lesions, lung nodules, and renal stones, respectively. Lesion texture was not significantly affected by radiation dose.

The second study demonstrating the utility of the lesion modeling framework focused on assessing the detectability of very low-contrast liver lesions in abdominal imaging. Specifically, detectability was assessed as a function of dose and reconstruction algorithm. As part of a parallel clinical trial, images from 21 patients were collected at 6 dose levels per patient on a SOMATOM Flash scanner. Subtle liver lesion models (contrast = -15 HU) were inserted into the raw projection data from the patient scans. The projections were then reconstructed with FBP and SAFIRE (strength 5), and lesion-less images were reconstructed as well. Noise, contrast, CNR, and the detectability index of an observer model (non-prewhitening matched filter) were assessed. It was found that SAFIRE reduced noise by 52%, reduced contrast by 12%, increased CNR by 87%, and increased the detectability index by 65% compared to FBP. Further, a 2AFC human perception experiment was performed to assess the dose reduction potential of SAFIRE, which was found to be 22% relative to the standard-of-care dose.

In conclusion, this dissertation provides to the scientific community a series of new methodologies, phantoms, analysis techniques, and modeling tools that can be used to rigorously assess image quality from modern CT systems. Specifically, methods to properly evaluate iterative reconstruction have been developed and are expected to aid in the safe clinical implementation of dose reduction technologies.

Relevance: 100.00%

Abstract:

Knowledge-based radiation treatment is an emerging concept in radiotherapy. It mainly refers to techniques that can guide or automate treatment planning in the clinic by learning from prior knowledge. Different models have been developed to realize it, one of which was proposed by Yuan et al. at Duke for lung IMRT planning. This model can automatically determine both the beam configuration and the optimization objectives with non-coplanar beams, based on patient-specific anatomical information. Although plans automatically generated by this model demonstrate equivalent or better dosimetric quality compared to clinically approved plans, its validity and generality are limited by the empirical assignment of a coefficient called the angle spread constraint, defined in the beam efficiency index used for beam ranking. To eliminate these limitations, a systematic study of this coefficient is needed to establish its optimal value.

To this end, eleven lung cancer patients with complex tumor shapes, whose clinically approved plans used non-coplanar beams, were retrospectively studied within the framework of the automatic lung IMRT treatment algorithm. The primary and boost plans used in three patients were treated as different cases because of their different target sizes and shapes. A total of 14 lung cases were thus re-planned using the knowledge-based automatic lung IMRT planning algorithm, varying the angle spread constraint from 0 to 1 in increments of 0.2. A modified beam angle efficiency index was adopted to navigate the beam selection. Great effort was made to ensure that the quality of the plans associated with every angle spread constraint value was as good as possible. Important dosimetric parameters for the PTV and OARs, quantitatively reflecting plan quality, were extracted from the DVHs and analyzed as a function of the angle spread constraint for each case. These parameters were compared between clinical plans and model-based plans using two-sample Student's t-tests, and regression analysis was performed on a composite index, built on the percentage errors between dosimetric parameters in the model-based plans and those in the clinical plans, as a function of the angle spread constraint.

Results show that model-based plans generally have equivalent or better quality than clinically approved plans, both qualitatively and quantitatively. All dosimetric parameters except those for the lungs in the automatically generated plans are statistically better than or comparable to those in the clinical plans. On average, reductions of more than 15% in the conformity index and homogeneity index for the PTV and in V40 and V60 for the heart are observed, along with increases of 8% and 3% in V5 and V20 for the lungs, respectively. The intra-plan comparison among model-based plans demonstrates that plan quality does not change much for angle spread constraints larger than 0.4. Further examination of the variation of the composite index as a function of the angle spread constraint shows that 0.6 is the optimal value, yielding statistically the best achievable plans.
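To make the sweep-and-score procedure concrete, here is a hedged Python sketch of the study's logic: re-plan each case at every angle spread constraint value and score the result with a composite index built from percentage errors relative to the clinical plan. The `replan` function and its outputs are hypothetical stand-ins for the knowledge-based planning algorithm.

```python
import numpy as np

# `replan` is a hypothetical stand-in for the knowledge-based automatic
# planning algorithm: it returns dosimetric parameters for one case at one
# angle spread constraint (ASC) value.
def replan(case, asc):
    rng = np.random.default_rng(hash((case, asc)) % 2**32)
    return {"CI": 1.00 + rng.normal(0, 0.02),       # conformity index
            "V20_lung": 25.0 + rng.normal(0, 1.0)}  # % lung volume at 20 Gy

clinical = {"CI": 1.05, "V20_lung": 26.0}           # clinical-plan reference

# Sweep ASC from 0 to 1 in steps of 0.2, scoring each value with a composite
# index built from percentage errors vs. the clinical plan, as in the study.
for asc in np.round(np.arange(0.0, 1.01, 0.2), 1):
    scores = []
    for case in range(14):                          # the 14 re-planned cases
        params = replan(case, float(asc))
        pct_err = [abs(params[k] - clinical[k]) / abs(clinical[k])
                   for k in params]
        scores.append(np.mean(pct_err))             # composite index per case
    print(f"ASC = {asc:.1f}: composite index = {np.mean(scores):.4f}")
```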

Relevance: 100.00%

Abstract:

Today, the trend towards decentralization is far-reaching. Proponents of decentralization have argued that it promotes responsive and accountable local government by shortening the distance between local representatives and their constituents. In this paper, however, I focus on the countervailing effect of decentralization on the accountability mechanism, arguing that decentralization, which increases the number of actors eligible for policy making and implementation in governance as a whole, may blur lines of responsibility, thus weakening citizens' ability to sanction government in elections. Using an ordinary least squares (OLS) interaction model based on historical panel data for 78 countries over the 2002–2010 period, I test the hypothesis that as the number of government tiers increases, there will be a negative interaction between the number of government tiers and decentralization policies. The regression results provide empirical evidence that decentralization policies, which have a positive impact on governance under a relatively simple form of multilevel governance, no longer have statistically significant effects once the complexity of government structure exceeds a certain degree. In particular, this paper found that the presence of intergovernmental meetings with legally binding authority has a negative impact on governance when the complexity of government structure reaches the highest level.
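For concreteness, the interaction specification can be sketched with statsmodels; the variable names and synthetic data below are hypothetical stand-ins for the 78-country panel.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(7)

# Synthetic stand-in for the 78-country, 2002-2010 panel; variable names are
# hypothetical: tiers = number of government tiers, decent = decentralization
# policy index, gov = governance outcome.
n = 700
df = pd.DataFrame({"tiers": rng.integers(2, 6, n),
                   "decent": rng.normal(0.0, 1.0, n)})
df["gov"] = (0.4 * df["decent"] - 0.1 * df["decent"] * df["tiers"]
             + rng.normal(0.0, 1.0, n))

# OLS with a multiplicative interaction: the decent:tiers coefficient captures
# how the effect of decentralization changes as government tiers multiply.
fit = smf.ols("gov ~ decent * tiers", data=df).fit()
print(fit.params)   # expect a negative decent:tiers coefficient here
```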

Relevance: 100.00%

Abstract:

Globally, the current state of freshwater resource management is insufficient and is impeding the chance at a sustainable future. Human interference in the natural hydrologic cycle is becoming dangerously irreversible, and the need to redefine managerial approaches to the resource is imminent. This research involves the development of a coupled natural-human freshwater resource supply model using a System Dynamics approach. The model was applied to two case studies: Somalia, Africa, and the Phoenix Active Management Area in Arizona, USA. It is suggested that System Dynamics modeling would be an invaluable tool for achieving sustainable freshwater resource management in individual watersheds. Through a series of thought experiments, freshwater resource managers and policy-makers can obtain a thorough understanding of the system's dynamic behavior and examine various courses of action for alleviating freshwater supply concerns. This thesis reviews the model, its development, and an analysis of several thought experiments applied to the case studies.
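As a toy illustration of the stock-and-flow structure that such a model formalizes, here is a minimal Euler-integrated sketch with one stock (storage), a natural inflow (recharge), and a human outflow (withdrawal) coupled to population; all parameters are invented for illustration.

```python
# One stock (aquifer storage) with a natural inflow (recharge) and a human
# outflow (withdrawal) coupled to population growth. All numbers invented.
dt, years = 0.1, 50
storage = 1000.0            # km^3 in storage
population = 1.0            # millions of people
recharge = 8.0              # km^3 per year
per_capita_demand = 6.0     # km^3 per year per million people

for _ in range(int(years / dt)):
    withdrawal = per_capita_demand * population
    if storage < 200.0:                      # feedback: scarcity curbs use
        withdrawal *= storage / 200.0
    storage += (recharge - withdrawal) * dt  # Euler integration of the stock
    population *= 1.0 + 0.02 * dt            # exogenous 2%/yr growth

print(f"storage after {years} yr: {storage:.0f} km^3, "
      f"population {population:.2f} M")
```

Thought experiments of the kind the thesis describes amount to re-running such a loop under alternative policies (demand caps, recharge enhancement) and comparing the stock trajectories.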

Relevance: 100.00%

Abstract:

In Model-Driven Engineering (MDE), the developer creates a model using a language such as the Unified Modeling Language (UML) or UML for Real-Time (UML-RT) and uses tools such as Papyrus or Papyrus-RT to generate code based on that model. Tracing allows developers to gain insights into their running application, such as which events occur and their timing. We add monitoring capabilities to models created in UML-RT with Papyrus-RT, using the Linux Trace Toolkit: next generation (LTTng). The implementation requires changing the code generator so that it adds tracing statements to the generated code for the events the user wants to monitor. We also change the makefile to automate the build process, and we create an Extensible Markup Language (XML) file that allows developers to view their traces visually in Trace Compass, an Eclipse-based trace viewing tool. Finally, we validate our results using three models that we create and trace.
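As a rough illustration of the injection idea, here is a hypothetical Python sketch that post-processes generated source text and inserts a trace statement at the entry of each monitored handler. The real implementation modifies the Papyrus-RT code generator itself and emits LTTng-UST tracepoints in the generated C++; the handler names and tracepoint provider below are made up.

```python
import re

# Hypothetical sketch of the injection idea: scan generated source and add a
# trace statement at the entry of each handler the user wants to monitor.
MONITORED = {"onStart", "onStop"}   # made-up handler names

def instrument(source: str) -> str:
    out = []
    for line in source.splitlines(keepends=True):
        out.append(line)
        match = re.match(r"\s*void \w+::(\w+)\(.*\{", line)
        if match and match.group(1) in MONITORED:
            # LTTng-UST style tracepoint(provider, event, ...) call.
            out.append(f"    tracepoint(model_events, {match.group(1)}_entry);\n")
    return "".join(out)

print(instrument("void Capsule_Top::onStart(const Msg* msg) {\n}\n"))
```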

Relevance: 100.00%

Abstract:

Radiocarbon dating and Bayesian chronological modelling, undertaken as part of the investigation by the Times of Their Lives project into the development of Late Neolithic settlement and pottery in Orkney, have provided precise new dating for the Grooved Ware settlement of Barnhouse, excavated in 1985–91. Previous understandings of the site and its pottery are presented. A Bayesian model based on 70 measurements on 62 samples (of which 50 samples are thought to date accurately the deposits from which they were recovered) suggests that the settlement probably began in the later 32nd century cal BC (with Houses 2, 9, 3 and perhaps 5a), possibly as a planned foundation. Structure 8, a large, monumental structure that differs in character from the houses, was probably built just after the turn of the millennium. Varied house durations and replacements are estimated. House 2 went out of use before the end of the settlement, and Structure 8 was probably the last element to be abandoned, probably during the earlier 29th century cal BC. The Grooved Ware pottery from the site is characterised by small, medium-sized, and large vessels with incised and impressed decoration, including a distinctive, false-relief, wavy-line cordon motif. A considerable degree of consistency is apparent in many aspects of ceramic design and manufacture over the use-life of the settlement, the principal change being the appearance, from c. 3025–2975 cal BC, of large coarse ware vessels with uneven surfaces and thick applied cordons, and of the use of applied dimpled circular pellets. The circumstances of the new foundation of settlement in the western part of Mainland are discussed, as well as the maintenance and character of the site. The pottery from the site is among the earliest Grooved Ware so far dated. Its wider connections are noted, as well as the significant implications for our understanding of the timing and circumstances of the emergence of Grooved Ware, and the role of material culture in social strategies.

Relevance: 100.00%

Abstract:

This paper proposes a novel demand response model using a fuzzy subtractive clustering approach. The model supports domestic consumers' decisions on the management of controllable loads, considering their consumption needs and the appropriate load shaping or rescheduling required to achieve possible economic benefits. The model, based on the fuzzy subtractive clustering method, considers clusters of domestic consumption covering an adequate consumption range. An analysis of different scenarios is presented, considering available electric power and electric energy prices. Simulation results are presented, and conclusions of the proposed demand response model are discussed.
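For readers unfamiliar with the technique, here is a simplified NumPy sketch of subtractive clustering in the style of Chiu's method, applied to toy consumption feature vectors. The radii and the single-threshold stopping rule are simplified assumptions (the full method uses separate acceptance and rejection thresholds), and the data are synthetic.

```python
import numpy as np

rng = np.random.default_rng(8)

# Toy consumption feature vectors (e.g., normalized daily load statistics)
# drawn around three behavioral profiles.
X = np.vstack([rng.normal(m, 0.3, size=(60, 2)) for m in (0.5, 2.0, 3.5)])

def subtractive_clustering(X, ra=1.0, eps=0.15):
    """Chiu-style subtractive clustering: every point is a candidate center."""
    rb = 1.5 * ra                                         # suppression radius
    d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)   # pairwise sq. dists
    potential = np.exp(-4.0 * d2 / ra**2).sum(axis=1)     # initial potentials
    first, centers = potential.max(), []
    while potential.max() > eps * first:
        c = int(potential.argmax())
        centers.append(X[c])
        # Demote points near the accepted center so the next pick is distant.
        potential -= potential[c] * np.exp(-4.0 * d2[:, c] / rb**2)
    return np.array(centers)

print(subtractive_clustering(X))   # expect roughly the three profile means
```

Each recovered center can then seed a fuzzy consumption cluster, which is the role the method plays in the proposed demand response model.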