71 results for VERIFICATION


Relevance:

10.00%

Publisher:

Abstract:

The motivation for this study was to reduce physics workload relating to patient-specific quality assurance (QA). VMAT plan delivery accuracy was determined from analysis of pre- and on-treatment trajectory log files and phantom-based ionization chamber array measurements. The correlation in this combination of measurements for patient-specific QA was investigated. The relationship between delivery errors and plan complexity was investigated as a potential method to further reduce patient-specific QA workload. Thirty VMAT plans from three treatment sites - prostate only, prostate and pelvic node (PPN), and head and neck (H&N) - were retrospectively analyzed in this work. The 2D fluence delivery reconstructed from pretreatment and on-treatment trajectory log files was compared with the planned fluence using gamma analysis. Pretreatment dose delivery verification was also carried out using gamma analysis of ionization chamber array measurements compared with calculated doses. Pearson correlations were used to explore any relationship between trajectory log file (pretreatment and on-treatment) and ionization chamber array gamma results (pretreatment). Plan complexity was assessed using the MU/arc and the modulation complexity score (MCS), with Pearson correlations used to examine any relationships between complexity metrics and plan delivery accuracy. Trajectory log files were also used to further explore the accuracy of MLC and gantry positions. Pretreatment 1%/1 mm gamma passing rates for trajectory log file analysis were 99.1% (98.7%-99.2%), 99.3% (99.1%-99.5%), and 98.4% (97.3%-98.8%) (median (IQR)) for prostate, PPN, and H&N, respectively, and were significantly correlated to on-treatment trajectory log file gamma results (R = 0.989, p < 0.001). Pretreatment ionization chamber array (2%/2 mm) gamma results were also significantly correlated with on-treatment trajectory log file gamma results (R = 0.623, p < 0.001). Furthermore, all gamma results displayed a significant correlation with MCS (R > 0.57, p < 0.001), but not with MU/arc. Average MLC position and gantry angle errors were 0.001 ± 0.002 mm and 0.025° ± 0.008° over all treatment sites and were not found to affect delivery accuracy. However, variability in MLC speed was found to be directly related to MLC position accuracy. The accuracy of VMAT plan delivery assessed using pretreatment trajectory log file fluence delivery and ionization chamber array measurements was strongly correlated with on-treatment trajectory log file fluence delivery. The strong correlation between trajectory log file and phantom-based gamma results demonstrates potential to reduce our current patient-specific QA workload. Additionally, insight into MLC and gantry position accuracy through trajectory log file analysis and the strong correlation between gamma analysis results and the MCS could also provide further methodologies to optimize both the VMAT planning and QA processes.
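As a concrete illustration of the gamma analysis used throughout this study, the following is a minimal brute-force sketch of a global 2D gamma comparison in the standard gamma-index sense; it is not the authors' implementation, and the function name, the 3x search window and the same-grid assumption are ours.

```python
import numpy as np

def gamma_pass_rate(ref, meas, spacing_mm, dose_tol=0.01, dist_mm=1.0,
                    threshold=0.10):
    """Global 2D gamma passing rate (e.g. 1%/1 mm as in this study).

    ref, meas  : planned and reconstructed dose/fluence on the same grid
    spacing_mm : grid spacing in mm
    dose_tol   : dose criterion as a fraction of the global maximum
    threshold  : skip points below this fraction of max(ref)
    """
    dd = dose_tol * ref.max()                        # global dose criterion
    search = int(np.ceil(3 * dist_mm / spacing_mm))  # search radius (pixels)
    ny, nx = ref.shape
    gammas = []
    for y in range(ny):
        for x in range(nx):
            if ref[y, x] < threshold * ref.max():
                continue
            best = np.inf
            for dy in range(-search, search + 1):
                for dx in range(-search, search + 1):
                    yy, xx = y + dy, x + dx
                    if 0 <= yy < ny and 0 <= xx < nx:
                        r2 = (dy * dy + dx * dx) * spacing_mm**2 / dist_mm**2
                        d2 = (meas[yy, xx] - ref[y, x])**2 / dd**2
                        best = min(best, r2 + d2)
            gammas.append(np.sqrt(best))
    return 100.0 * np.mean(np.asarray(gammas) <= 1.0)
```

Calling gamma_pass_rate(planned, reconstructed, spacing_mm=1.0) would then reproduce a 1%/1 mm global analysis of the kind reported above.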

Relevance:

10.00%

Publisher:

Abstract:

The design cycle for complex special-purpose computing systems is extremely costly and time-consuming. It involves a multiparametric design space exploration for optimization, followed by design verification. Designers of special-purpose VLSI implementations often need to explore parameters, such as optimal bitwidth and data representation, through time-consuming Monte Carlo simulations. A prominent example of this simulation-based exploration process is the design of decoders for error correcting systems, such as the Low-Density Parity-Check (LDPC) codes adopted by modern communication standards, which involves thousands of Monte Carlo runs for each design point. Currently, high-performance computing offers a wide set of acceleration options that range from multicore CPUs to Graphics Processing Units (GPUs) and Field Programmable Gate Arrays (FPGAs). The exploitation of diverse target architectures is typically associated with developing multiple code versions, often using distinct programming paradigms. In this context, we evaluate the concept of retargeting a single OpenCL program to multiple platforms, thereby significantly reducing design time. A single OpenCL-based parallel kernel is used without modifications or code tuning on multicore CPUs, GPUs, and FPGAs. We use SOpenCL (Silicon to OpenCL), a tool that automatically converts OpenCL kernels to RTL, in order to introduce FPGAs as a potential platform to efficiently execute simulations coded in OpenCL. We use LDPC decoding simulations as a case study. Experimental results were obtained by testing a variety of regular and irregular LDPC codes that range from short/medium length (e.g., 8,000 bit) to long length (e.g., 64,800 bit) DVB-S2 codes. We observe that, depending on the design parameters to be simulated and on the dimension and phase of the design, the GPU or FPGA may suit different purposes more conveniently, thus providing different acceleration factors over conventional multicore CPUs.
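To make the single-source retargeting idea concrete, here is a minimal sketch using pyopencl, under our own assumptions: a toy kernel (an AWGN hard-decision error count standing in for one LDPC Monte Carlo step) and first-available-platform device selection. The SOpenCL FPGA flow (OpenCL to RTL) is not reproduced here; on CPUs and GPUs the same source runs unmodified.

```python
import numpy as np
import pyopencl as cl

# Toy kernel standing in for one Monte Carlo step of an LDPC simulation:
# BPSK over AWGN with hard decisions on an all-zeros codeword.
KERNEL = """
__kernel void awgn_errors(__global const float *noise,
                          __global int *err, const float sigma) {
    int i = get_global_id(0);
    err[i] = (1.0f + sigma * noise[i]) < 0.0f;  // symbol flipped by noise?
}
"""

def run(device_type, n=1_000_000, sigma=0.8):
    # The same OpenCL source is retargeted simply by picking a device type.
    devices = []
    for platform in cl.get_platforms():
        try:
            devices = platform.get_devices(device_type=device_type)
        except cl.RuntimeError:
            continue
        if devices:
            break
    ctx = cl.Context(devices[:1])
    queue = cl.CommandQueue(ctx)
    prog = cl.Program(ctx, KERNEL).build()
    noise = np.random.randn(n).astype(np.float32)
    err = np.zeros(n, dtype=np.int32)
    mf = cl.mem_flags
    nbuf = cl.Buffer(ctx, mf.READ_ONLY | mf.COPY_HOST_PTR, hostbuf=noise)
    ebuf = cl.Buffer(ctx, mf.WRITE_ONLY, err.nbytes)
    prog.awgn_errors(queue, (n,), None, nbuf, ebuf, np.float32(sigma))
    cl.enqueue_copy(queue, err, ebuf)
    return err.mean()   # raw channel bit-error rate for this noise level

# e.g. run(cl.device_type.CPU) vs run(cl.device_type.GPU)
```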

Relevance:

10.00%

Publisher:

Abstract:

1. The prediction and mapping of climate in areas between climate stations is of increasing importance in ecology.

2. Four categories of model (simple interpolation, thin plate splines, multiple linear regression and mixed spline-regression) were tested for their ability to predict the spatial distribution of temperature on the British mainland. The models were tested by external cross-verification.

3. The British distribution of mean daily temperature was predicted with the greatest accuracy by using a mixed model: a thin plate spline fitted to the surface of the country, after correction of the data by a selection from 16 independent topographical variables (such as altitude, distance from the sea, slope and topographic roughness), chosen by multiple regression from a digital terrain model (DTM) of the country (a code sketch of this mixed approach follows the list).

4. The next most accurate method was a pure multiple regression model using the DTM. Both regression and thin plate spline models based only on a few variables (latitude, longitude and altitude) were comparatively unsatisfactory, but some rather simple methods of surface interpolation (such as bilinear interpolation after correction to sea level) gave moderately satisfactory results. Differences between the methods seemed to depend largely on their ability to model the effect of the sea on land temperatures.

5. Prediction of temperature by the best methods was greater than 95% accurate in all months of the year, as shown by the correlation between the predicted and actual values. The predicted temperatures were calculated at real altitudes, not subject to sea-level correction.

6. A minimum of just over 30 temperature recording stations would generate a satisfactory surface, provided the stations were well spaced.

7. Maps of mean daily temperature, using the best overall methods, are provided; further important variables, such as continentality and length of growing season, were also mapped. Many of these are believed to be the first detailed representations at real altitude.

8. The interpolated monthly temperature surfaces are available on disk.
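A minimal sketch of the best-performing mixed spline-regression approach, using entirely synthetic stand-ins for the stations, the DTM covariates and the lapse-rate signal (none of the paper's data or variable selection is reproduced):

```python
import numpy as np
from scipy.interpolate import RBFInterpolator
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(0)

# Hypothetical stand-ins: station coordinates (km), two DTM-derived
# covariates (altitude in m, distance from the sea in km) and synthetic
# mean daily temperatures with a lapse-rate-like altitude dependence.
xy = rng.uniform(0, 1000, size=(200, 2))
alt = rng.uniform(0, 1000, 200)
dsea = rng.uniform(0, 100, 200)
X = np.column_stack([alt, dsea])
t = 10.0 - 0.006 * alt + 0.01 * dsea + rng.normal(0, 0.3, 200)

# Step 1: multiple regression of temperature on the topographic covariates.
reg = LinearRegression().fit(X, t)
resid = t - reg.predict(X)

# Step 2: thin plate spline over geographic position, fitted to residuals.
tps = RBFInterpolator(xy, resid, kernel='thin_plate_spline', smoothing=1.0)

# Step 3: prediction at each grid cell = regression part + spline part.
grid_xy = rng.uniform(0, 1000, size=(10, 2))
grid_X = np.column_stack([rng.uniform(0, 1000, 10), rng.uniform(0, 100, 10)])
t_pred = reg.predict(grid_X) + tps(grid_xy)
```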

Relevance:

10.00%

Publisher:

Abstract:

It is acknowledged that one of the consequences of the ageing process is cognitive decline, which leads to an increase in the incidence of illnesses such as dementia. This has become ever more relevant due to the projected growth of the ageing demographic. Dementia affects visuo-spatial perception, causing difficulty with wayfinding even during the early stages of the disease. The literature widely recognises the physical environment’s role in alleviating symptoms of dementia and improving quality of life for residents. It also identifies the lack of available housing options for older people with dementia and, consequently, that the current stock is ill-equipped to provide adequate support.
Recent statistics indicate that 80% of those residing in nursing or residential care homes have some form of dementia or severe memory problems. The shift towards institutional care settings and the need for specialist support and care place greater impetus on a person-centred approach to tackling issues related to wayfinding and dementia.
This thesis therefore aims to improve design for dementia in nursing and residential care settings in the context of Northern Ireland, in order to provide a better understanding of how people with dementia experience the physical environment and to highlight design features that assist with wayfinding. Current guidelines on design for dementia are limited, and many are theoretical, anecdotal and not definitive; greater verification is therefore required to address the less recognised design issues. The work is ultimately intended to improve quality of life, wellbeing and independence, and to uphold the dignity of people with dementia living in nursing or residential care homes.
The research design uses a mixed-methods approach. Thorough preparation and consideration of ethical issues informed the methodology, and each facet was trialled and piloted to identify ethical, technological, methodological, data collection and analysis issues; the protocol was then amended to resolve or mitigate them. Initially, a questionnaire based on leading design recommendations was conducted with home managers. Semi-structured interviews were developed from this and conducted with staff and residents' next of kin. An evidence-based approach was used to design a study employing ethnographic methods, including a wayfinding task; this followed a repeated-measures design intended to actively engage residents with dementia in the research. Complementary to the wayfinding task, conversational and semi-structured interviews were used to promote dialogue and direct responses with the person with dementia. In addition, Space Syntax methodologies were used to examine the physical properties of the architectural layout (a minimal sketch of this kind of analysis follows below), and this was cross-examined with interview responses and data from the wayfinding tasks.
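For readers unfamiliar with Space Syntax, the sketch below computes two of its basic graph measures (mean topological depth and relative asymmetry) on a hypothetical convex-space adjacency graph; the plan, room names and values are illustrative only and not drawn from the thesis.

```python
import networkx as nx

# Hypothetical convex-space adjacency graph of a care-home floor plan:
# nodes are spaces, edges are direct connections (doors/openings).
G = nx.Graph([
    ("entrance", "corridor"), ("corridor", "lounge"),
    ("corridor", "dining"), ("corridor", "bedroom wing"),
    ("lounge", "dining"), ("bedroom wing", "bathroom"),
])

k = G.number_of_nodes()
for space in G.nodes:
    depths = nx.shortest_path_length(G, source=space)
    md = sum(depths.values()) / (k - 1)   # mean topological depth
    ra = 2 * (md - 1) / (k - 2)           # relative asymmetry
    print(f"{space:14s} mean depth {md:.2f}  RA {ra:.2f}")
```

Lower relative asymmetry indicates a more integrated space, which Space Syntax research associates with spaces that are easier to find and navigate to.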
A number of plan typologies were identified and matched to the types of decision point encountered during the walks. The empirical work enabled the synthesis of environmental features which support wayfinding.
Results indicate that particular environmental features are associated with improved performance on the wayfinding tasks. By identifying these attributes and enhancing design for dementia, challenges with wayfinding may be overcome, and the physical environment can be seen to promote wellbeing.
The implications of this work are that the environmental features highlighted by the project can be used to inform guidelines, thus adding to existing knowledge. Future work would involve disseminating this information and potentially developing it into design standards or regulations which champion design for dementia. These would increase awareness among designers and stakeholders undertaking new projects, extensions or refurbishments.
A person-centred, evidence-based design was emphasised throughout the project, ensuring an in-depth study. There were limitations due to the available resources, time and funding. Future research would involve testing the identified environmental features within a specific environment to enable measured observation of improvements.

Relevance:

10.00%

Publisher:

Abstract:

We present a novel method for the light-curve characterization of Pan-STARRS1 Medium Deep Survey (PS1 MDS) extragalactic sources into stochastic variables (SVs) and burst-like (BL) transients, using multi-band image-differencing time-series data. We select detections in difference images associated with galaxy hosts using a star/galaxy catalog extracted from the deep PS1 MDS stacked images, and adopt a maximum a posteriori formulation to model their difference-flux time-series in four Pan-STARRS1 photometric bands gP1, rP1, iP1, and zP1. We use three deterministic light-curve models to fit BL transients: a Gaussian, a Gamma distribution, and an analytic supernova (SN) model; and one stochastic light-curve model, the Ornstein-Uhlenbeck process, to fit variability that is characteristic of active galactic nuclei (AGNs). We assess the quality of fit of the models band-wise and source-wise, using their estimated leave-one-out cross-validation likelihoods and corrected Akaike information criteria. We then apply a K-means clustering algorithm to these statistics to determine the source classification in each band. The final source classification is derived as a combination of the individual filter classifications, resulting in two measures of classification quality, from the averages across the photometric filters of (1) the classifications determined from the closest K-means cluster centers, and (2) the square distances from the clustering centers in the K-means clustering spaces. For a verification set of AGNs and SNe, we show that SV and BL occupy distinct regions in the plane constituted by these measures. We use our clustering method to characterize 4361 extragalactic image-difference detected sources, in the first 2.5 yr of the PS1 MDS, into 1529 BL and 2262 SV, with a purity of 95.00% for AGNs and 90.97% for SNe based on our verification sets. We combine our light-curve classifications with their nuclear or off-nuclear host galaxy offsets to define a robust photometric sample of 1233 AGNs and 812 SNe. With these two samples, we characterize their variability and host galaxy properties, and identify simple photometric priors that would enable their real-time identification in future wide-field synoptic surveys.
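A heavily simplified, single-band sketch of the model-selection step: it fits only two of the models named above (a Gaussian burst and an Ornstein-Uhlenbeck process), ignores measurement errors, uses AICc alone rather than the paper's combination of cross-validation likelihood and AICc, and clusters the resulting statistic with K-means. Function names and initial guesses are ours.

```python
import numpy as np
from scipy.optimize import minimize
from sklearn.cluster import KMeans

def ou_nll(params, t, f):
    """Exact negative log-likelihood of an Ornstein-Uhlenbeck (damped random
    walk) process at irregular times t; measurement errors ignored here."""
    mu, log_sig, log_tau = params
    sig, tau = np.exp(log_sig), np.exp(log_tau)
    a = np.exp(-np.diff(t) / tau)
    mean = np.r_[mu, mu + a * (f[:-1] - mu)]       # conditional means
    var = np.r_[sig**2, sig**2 * (1 - a**2)]       # conditional variances
    return 0.5 * np.sum(np.log(2 * np.pi * var) + (f - mean) ** 2 / var)

def gauss_nll(params, t, f):
    """Gaussian burst model with i.i.d. Gaussian residuals."""
    A, t0, log_w, log_s = params
    m = A * np.exp(-0.5 * ((t - t0) / np.exp(log_w)) ** 2)
    s2 = np.exp(2 * log_s)
    return 0.5 * np.sum(np.log(2 * np.pi * s2) + (f - m) ** 2 / s2)

def aicc(nll, k, n):
    # corrected Akaike information criterion
    return 2 * k + 2 * nll + 2 * k * (k + 1) / (n - k - 1)

def classify(light_curves):
    """Delta-AICc (burst minus OU) per source; K-means separates BL from SV."""
    stats = []
    for t, f in light_curves:
        s0 = np.log(f.std() + 1e-9)
        r_ou = minimize(ou_nll, [f.mean(), s0, np.log(30.0)],
                        args=(t, f), method="Nelder-Mead")
        r_g = minimize(gauss_nll, [f.max(), t[np.argmax(f)], np.log(20.0), s0],
                       args=(t, f), method="Nelder-Mead")
        n = len(f)
        stats.append(aicc(r_g.fun, 4, n) - aicc(r_ou.fun, 3, n))
    return KMeans(n_clusters=2, n_init=10).fit_predict(
        np.asarray(stats).reshape(-1, 1))
```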

Relevance:

10.00%

Publisher:

Abstract:

In this paper, we introduce a novel approach to face recognition which simultaneously tackles three combined challenges: 1) uneven illumination; 2) partial occlusion; and 3) limited training data. The new approach performs lighting normalization, occlusion de-emphasis and finally face recognition, based on finding the largest matching area (LMA) at each point on the face, as opposed to traditional fixed-size local area-based approaches. Robustness is achieved with novel approaches for feature extraction, LMA-based face image comparison and unseen data modeling. On the Extended Yale B and AR face databases for face identification, our method, using only a single training image per person, outperforms other single-training-image methods and matches or exceeds methods which require multiple training images. On the Labeled Faces in the Wild face verification database, our method outperforms comparable unsupervised methods. We also show that the new method performs competitively even when the training images are corrupted.
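The following toy sketch conveys the largest-matching-area idea only, on raw feature arrays: at each point, a square window is grown while the normalised distance between the two faces stays under a tolerance, and the pair is scored by the window sizes reached. It omits the paper's lighting normalization, occlusion de-emphasis, feature extraction and unseen-data modeling; the names and thresholds are ours.

```python
import numpy as np

def largest_matching_area(f1, f2, max_half=8, tol=0.15):
    """Toy LMA-style comparison of two same-shape feature maps: grow a
    window at each pixel while the RMS difference stays below `tol`,
    and return the mean largest matching half-width as a similarity."""
    h, w = f1.shape
    sizes = np.zeros((h, w))
    for y in range(h):
        for x in range(w):
            for r in range(1, max_half + 1):
                y0, y1 = max(0, y - r), min(h, y + r + 1)
                x0, x1 = max(0, x - r), min(w, x + r + 1)
                a, b = f1[y0:y1, x0:x1], f2[y0:y1, x0:x1]
                if np.sqrt(np.mean((a - b) ** 2)) > tol:
                    break
                sizes[y, x] = r   # window still matches at this radius
    return sizes.mean()
```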

Relevance:

10.00%

Publisher:

Abstract:

In this paper, novel closed-form expressions for the level crossing rate and average fade duration of κ-μ shadowed fading channels are derived. The new equations provide the capability of modeling the correlation between the time derivative of the shadowed dominant and multipath components of the κ-μ shadowed fading envelope. Verification of the new equations is performed by reduction to a number of known special cases. It is shown that as the shadowing of the resultant dominant component decreases, the signal crosses lower threshold levels at a reduced rate. Furthermore, the impact of increasing correlation between the slope of the shadowed dominant and multipath components similarly acts to reduce crossings at lower signal levels. The new expressions for the second-order statistics are also compared with field measurements obtained for cellular device-to-device and body-centric communication channels, which are known to be susceptible to shadowed fading.
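The closed-form expressions themselves are not reproduced here, but the second-order statistics they describe can be estimated empirically from any sampled envelope, which is how such equations are typically verified against simulation. A minimal sketch, with a crude band-limited Rayleigh envelope standing in for a κ-μ shadowed one (simulating the full model would need its dominant and shadowing components):

```python
import numpy as np

def lcr_afd(env, thresh, fs):
    """Empirical level crossing rate (upward crossings per second) and
    average fade duration (seconds) of a sampled fading envelope."""
    below = env < thresh
    crossings = np.count_nonzero(below[:-1] & ~below[1:])  # up-crossings
    T = env.size / fs                                      # record length
    lcr = crossings / T
    afd = below.mean() / lcr if lcr > 0 else np.inf        # fade time/fade
    return lcr, afd

# Illustrative usage with a band-limited complex Gaussian (Rayleigh) fade.
fs = 1000.0
x = np.random.randn(100_000) + 1j * np.random.randn(100_000)
env = np.abs(np.convolve(x, np.ones(50) / 50, mode="same"))
print(lcr_afd(env, np.median(env), fs))
```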

Relevance:

10.00%

Publisher:

Abstract:

This paper demonstrates the unparalleled value of full-scale data acquired from ocean trials of Aquamarine Power’s Oyster 800 Wave Energy Converter (WEC) at the European Marine Energy Centre (EMEC), Orkney, Scotland.
High quality prototype and wave data were simultaneously recorded in over 750 distinct sea states (comprising different wave height, wave period and tidal height combinations) and include periods of operation where the hydraulic Power Take-Off (PTO) system was both pressurised (damped operation) and de-pressurised (undamped operation).
A detailed model-prototype correlation procedure is presented in which the full-scale prototype behaviour is compared with predictions from both experimental and numerical modelling techniques via a high-temporal-resolution wave-by-wave reconstruction. This unquestionably provides the definitive verification of the capabilities of such research techniques and allows a robust and meaningful uncertainty analysis to be performed on their outputs.
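Wave-by-wave comparison of this kind typically starts from zero up-crossing segmentation of the surface-elevation records. A minimal sketch, with a synthetic record standing in for the EMEC wave data (the segmentation conventions are standard ocean-engineering practice; everything else is illustrative):

```python
import numpy as np

def zero_upcross_waves(eta, fs):
    """Segment a surface-elevation record into individual waves by zero
    up-crossings; returns per-wave height (crest-to-trough) and period."""
    idx = np.nonzero((eta[:-1] < 0) & (eta[1:] >= 0))[0]
    heights, periods = [], []
    for a, b in zip(idx[:-1], idx[1:]):
        seg = eta[a:b]
        heights.append(seg.max() - seg.min())
        periods.append((b - a) / fs)
    return np.array(heights), np.array(periods)

# Synthetic stand-in for a measured record; prototype and model records
# segmented this way can then be compared wave by wave.
fs = 10.0
t = np.arange(0, 600, 1 / fs)
eta = np.sin(0.5 * t) + 0.3 * np.sin(1.3 * t + 1.0)
H, T = zero_upcross_waves(eta, fs)
```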
The importance of a good data-capture methodology, in terms of both handling and accuracy, is also presented. The techniques and procedures implemented by Aquamarine Power for real-time data management are discussed, including lessons learned on the instrumentation and infrastructure required to collect high-value data.

Relevance:

10.00%

Publisher:

Abstract:

In this study, the PTW 1000SRS array with the Octavius 4D phantom was characterised for FF and FFF beams. MU linearity, field size, dose rate, dose per pulse (DPP) response and dynamic conformal arc treatment accuracy of the 1000SRS array were assessed for 6MV, 6FFF and 10FFF beams using a Varian TrueBeam STx linac. The measurements were compared with a PinPoint ionization chamber, a microDiamond detector and EBT3 Gafchromic film. Measured dose profiles and FWHMs were compared with film measurements. Verification of FFF volumetric modulated arc therapy (VMAT) clinical plans was assessed using gamma analysis with 3%/3 mm and 2%/2 mm tolerances (10% threshold). To assess the effect of cross-calibration dose rate, clinical plans with different dose rates were delivered and analysed. Output factors agreed with film measurements to within 4.5% for fields between 0.5 and 1 cm and within 2.7% for field sizes between 1.5 and 10 cm, and were highly correlated with the microDiamond detector. Field sizes measured with the 1000SRS array were within 0.5 mm of film measurements. A drop in response of up to 1.8%, 2.4% and 5.2% for 6MV, 6FFF and 10FFF beams respectively was observed with increasing nominal dose rate. With an increase in DPP, a drop of up to 1.7%, 2.4% and 4.2% was observed in 6MV, 6FFF and 10FFF respectively. The differences in dose following dynamic conformal arc deliveries were less than 1% (all energies) from calculated. Delivered VMAT plans showed an average pass percentage of 99.5(±0.8)% and 98.4(±3.4)% with the 2%/2 mm criteria for 6FFF and 10FFF respectively. A drop to 97.7(±2.2)% and 88.4(±9.6)% was observed for 6FFF and 10FFF respectively when plans were delivered at the minimum dose rate and calibrated at the maximum dose rate. Calibration using a beam with the average dose rate of the plan may be an efficient method to overcome the dose rate effects observed with the 1000SRS array.
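The suggested mitigation, cross-calibrating at a dose rate representative of the plan, amounts to rescaling readings by a measured response curve. A minimal sketch with hypothetical calibration points (the numbers below are illustrative, loosely shaped like the 10FFF response drop reported above, and are not the paper's data):

```python
import numpy as np

# Hypothetical relative-response calibration points vs nominal dose rate.
dose_rate = np.array([400.0, 1400.0, 2400.0])   # MU/min (illustrative)
rel_resp = np.array([1.000, 0.976, 0.948])      # ~5.2% drop at maximum

def corrected_reading(raw, plan_dose_rate):
    """Rescale a raw array reading by the interpolated response factor,
    emulating cross-calibration at the plan's average dose rate."""
    factor = np.interp(plan_dose_rate, dose_rate, rel_resp)
    return raw / factor
```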

Relevance:

10.00%

Publisher:

Abstract:

With the rapid development of the internet-of-things (IoT), face scrambling has been proposed for privacy protection during IoT-targeted image/video distribution. Consequently, in these IoT applications, biometric verification needs to be carried out in the scrambled domain, presenting significant challenges for face recognition. Since face models become chaotic signals after scrambling/encryption, a typical solution is to utilize traditional data-driven face recognition algorithms; chaotic pattern recognition, however, remains a challenging task. In this paper we propose a new ensemble approach, Many-Kernel Random Discriminant Analysis (MK-RDA), to discover discriminative patterns from chaotic signals. We also incorporate a salience-aware strategy into the proposed ensemble method to handle chaotic facial patterns in the scrambled domain, where random selections of features are made on semantic components via salience modelling. In our experiments, the proposed MK-RDA was tested rigorously on three human face datasets: the ORL face dataset, the PIE face dataset and the PUBFIG wild face dataset. The experimental results demonstrate that the proposed scheme can effectively handle chaotic signals and significantly improves recognition accuracy, making our method a promising candidate for secure biometric verification in emerging IoT applications.
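MK-RDA itself combines many kernels with salience-aware random feature selection; as a much simpler illustration of the random-discriminant-ensemble ingredient only, here is a random-subspace LDA ensemble with majority voting. The class name and parameters are ours, and no salience modelling or kernels are included.

```python
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

class RandomSubspaceLDA:
    """Each ensemble member trains LDA on a random subset of (scrambled)
    pixel features; the final identity is decided by majority vote."""

    def __init__(self, n_members=25, subset_frac=0.3, seed=0):
        self.n_members = n_members
        self.subset_frac = subset_frac
        self.rng = np.random.default_rng(seed)

    def fit(self, X, y):
        k = max(2, int(self.subset_frac * X.shape[1]))
        self.members_ = []
        for _ in range(self.n_members):
            idx = self.rng.choice(X.shape[1], size=k, replace=False)
            self.members_.append(
                (idx, LinearDiscriminantAnalysis().fit(X[:, idx], y)))
        return self

    def predict(self, X):
        votes = np.array([m.predict(X[:, idx]) for idx, m in self.members_])
        # column-wise majority vote (labels assumed non-negative integers)
        return np.array([np.bincount(col).argmax() for col in votes.T])
```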

Relevance:

10.00%

Publisher:

Abstract:

(abridged) We aim to study the inner-wind structure (R < 250 Rstar) of the well-known red supergiant VY CMa. We analyse high spatial resolution (~0.24"x0.13") ALMA Science Verification (SV) data in band 7, in which four thermal emission lines of gaseous sodium chloride (NaCl) are present at high signal-to-noise ratio. For the first time, the NaCl emission in the inner wind region of VY CMa is spatially resolved. The ALMA observations reveal the contribution of up to four different spatial regions. The NaCl emission pattern is different from that of the dust continuum and TiO2 emission already analysed from the ALMA SV data. The emission can be reconciled with an axisymmetric geometry, where the lower-density polar/rotation axis has a position angle of ~50 degrees measured from north to east. However, this picture cannot capture the full morphological diversity, and discrete mass ejection events need to be invoked to explain localized higher-density regions. The velocity traced by the gaseous NaCl line profiles is significantly lower than the average wind terminal velocity, and much slower than some of the fastest mass ejections, signalling a wide range of characteristic speeds for the mass loss. Gaseous NaCl is detected far beyond the main dust condensation region. Given the refractory nature of this metal halide, this hints at a chemical process preventing all NaCl from condensing onto dust grains. We show that in the case of the ratio of the surface binding temperature to the grain temperature being ~50, only some 10% of NaCl remains in gaseous form, while for lower values of this ratio thermal desorption efficiently evaporates NaCl. Photodesorption by stellar photons does not seem to be a viable explanation for the detection of gaseous NaCl at 220 Rstar from the central star; instead, we propose shock-induced sputtering driven by localized mass ejection events as an alternative.
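To see why the binding-to-grain temperature ratio is so decisive, consider a first-order thermal desorption (Polanyi-Wigner) timescale with a typical attempt frequency of 1e13 s^-1; the functional form is standard astrochemistry, but its use here as an illustration, and the sampled ratios, are ours rather than the paper's calculation.

```python
import numpy as np

NU = 1e13        # s^-1, typical lattice-vibration (attempt) frequency
YEAR = 3.156e7   # s

def desorption_timescale(ratio):
    """First-order thermal desorption timescale t = 1 / (nu * exp(-Tb/Td)),
    as a function of the binding-to-grain temperature ratio Tb/Td."""
    return 1.0 / (NU * np.exp(-ratio))

for r in (40, 50, 60):
    print(f"Tb/Td = {r}: {desorption_timescale(r) / YEAR:.3g} yr")
```

The timescale swings from hours at a ratio of 40 to hundreds of thousands of years at 60, which is why modest changes in this ratio separate efficient evaporation from substantial retention of gaseous NaCl.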