954 results for Metric access method


Relevance:

30.00%

Publisher:

Abstract:

This paper proposes a transmission and wheeling pricing method based on tracing monetary flows along power flow paths: the monetary flow-monetary path method. Active and reactive power flows are converted into monetary flows by using nodal prices. The method introduces a uniform measure of transmission service usage by active and reactive power. Because the monetary flows are related to the nodal prices, the impacts of generators and loads on operation constraints, and the interactive impacts between active and reactive power, can be considered. The total transmission service cost is separated into more practical line-related costs and a system-wide cost, and can be flexibly distributed between generators and loads. The method is able to reconcile transmission service costs fairly and to optimize transmission system operation and development. A case study on the IEEE 30-bus test system shows that the proposed pricing method is effective in creating economic signals towards the efficient use and operation of the transmission system. (c) 2005 Elsevier B.V. All rights reserved.
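As a hedged illustration of the conversion step described above (a sketch only, not the paper's full formulation), the snippet below turns a line's active and reactive power flows into monetary flows using nodal prices; the function name, arguments, and the simple price-times-flow product are assumptions for illustration.

```python
# Illustrative sketch only: converting line power flows into "monetary flows"
# using nodal prices, in the spirit of the monetary flow-monetary path idea.
# The names (p_flow_mw, q_flow_mvar, price_p, price_q) are assumed, not from the paper.

def monetary_flow(p_flow_mw, q_flow_mvar, price_p, price_q):
    """Return the monetary flows ($/h) carried by a line's active and reactive power.

    p_flow_mw   : active power flow on the line (MW)
    q_flow_mvar : reactive power flow on the line (MVAr)
    price_p     : nodal price of active power at the sending node ($/MWh)
    price_q     : nodal price of reactive power at the sending node ($/MVArh)
    """
    active_money = p_flow_mw * price_p      # $/h attributable to active power
    reactive_money = q_flow_mvar * price_q  # $/h attributable to reactive power
    return active_money, reactive_money

# Example: a line carrying 80 MW and 20 MVAr with nodal prices of 42 $/MWh and 3 $/MVArh
print(monetary_flow(80.0, 20.0, 42.0, 3.0))  # (3360.0, 60.0)
```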

Relevance:

30.00%

Publisher:

Abstract:

There has been an increased demand for characterizing user access patterns using web mining techniques, since the informative knowledge extracted from web server log files can offer benefits not only for web site structure improvement but also for a better understanding of user navigational behavior. In this paper, we present a web usage mining method which utilizes web usage and page linkage information to capture user access patterns based on the Probabilistic Latent Semantic Analysis (PLSA) model. A specific probabilistic analysis algorithm, the EM algorithm, is applied to the integrated usage data to infer the latent semantic factors as well as to generate user session clusters for revealing user access patterns. Experiments have been conducted on a real-world data set to validate the effectiveness of the proposed approach. The results show that the presented method is capable of characterizing the latent semantic factors and generating user profiles in terms of weighted page vectors, which may reflect the common access interests exhibited by users within the same session cluster.
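A minimal sketch of an EM iteration for a PLSA aspect model on a session-by-page count matrix is shown below; the synthetic matrix, the number of latent factors, and the initialisation are assumptions, and the code illustrates the standard PLSA updates rather than the authors' exact implementation.

```python
import numpy as np

# Minimal PLSA (aspect model) EM sketch on a session-by-page count matrix N.
# N, the number of factors K, and the random initialisation are illustrative assumptions.
rng = np.random.default_rng(0)
N = rng.integers(0, 5, size=(6, 8)).astype(float)   # 6 sessions x 8 pages
K = 2                                                # latent semantic factors

p_z = np.full(K, 1.0 / K)                            # P(z)
p_s_z = rng.random((K, N.shape[0]))                  # P(session | z)
p_s_z /= p_s_z.sum(1, keepdims=True)
p_w_z = rng.random((K, N.shape[1]))                  # P(page | z)
p_w_z /= p_w_z.sum(1, keepdims=True)

for _ in range(50):
    # E-step: responsibilities P(z | s, w), shape (K, sessions, pages)
    joint = p_z[:, None, None] * p_s_z[:, :, None] * p_w_z[:, None, :]
    post = joint / (joint.sum(0, keepdims=True) + 1e-12)
    # M-step: re-estimate the factor distributions from expected counts
    weighted = post * N[None, :, :]
    p_s_z = weighted.sum(2)
    p_s_z /= p_s_z.sum(1, keepdims=True) + 1e-12
    p_w_z = weighted.sum(1)
    p_w_z /= p_w_z.sum(1, keepdims=True) + 1e-12
    p_z = weighted.sum((1, 2))
    p_z /= p_z.sum()

# Sessions can then be clustered by their dominant factor, argmax_z P(z) * P(s | z)
print(np.argmax(p_s_z.T * p_z, axis=1))
```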

Relevance:

30.00%

Publisher:

Abstract:

Current image database metadata schemas require users to adopt a specific text-based vocabulary. Text-based metadata is good for searching but not for browsing. Existing image-based search facilities, on the other hand, are highly specialised and so suffer similar problems. Wexelblat's semantic dimensional spatial visualisation schemas go some way towards addressing this problem by making both searching and browsing more accessible to the user in a single interface. But the question of how, and what, initial metadata to enter into a database remains. Different people see different things in an image and will organise a collection in equally diverse ways. However, we can find some similarity across groups of users regardless of their reasoning. For example, a search on Amazon.com also returns related products, based on an averaging of how users navigate the database. In this paper, we report on applying this concept to a set of images that we visualised using both traditional methods and the Amazon.com-style averaging method. We report on the findings of this comparative investigation in a case study setting involving a group of randomly selected participants. We conclude with the recommendation that, in combination, the traditional and averaging methods would provide an enhancement to current database visualisation, searching, and browsing facilities.
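As a hedged sketch of the "averaging" idea invoked above (not the system evaluated in the paper), the code below counts how often pairs of images are viewed in the same session, so that images frequently co-viewed with a query image can be surfaced alongside it; the session data and function names are assumptions.

```python
from collections import Counter
from itertools import combinations

# Illustrative sketch of "users who viewed this also viewed": count how often
# image pairs co-occur in the same browsing session. The sessions below are assumed data.
sessions = [
    ["sunset.jpg", "beach.jpg", "palm.jpg"],
    ["sunset.jpg", "beach.jpg"],
    ["palm.jpg", "forest.jpg"],
]

co_views = Counter()
for session in sessions:
    for a, b in combinations(sorted(set(session)), 2):
        co_views[(a, b)] += 1

def also_viewed(image, top_n=3):
    """Images most often viewed in the same session as `image`."""
    scores = Counter()
    for (a, b), n in co_views.items():
        if a == image:
            scores[b] += n
        elif b == image:
            scores[a] += n
    return scores.most_common(top_n)

print(also_viewed("sunset.jpg"))  # [('beach.jpg', 2), ('palm.jpg', 1)]
```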

Relevance:

30.00%

Publisher:

Abstract:

Introduction: We have adapted the existing optometry diabetic retinopathy screening pathway and software so that it can be used for wet AMD fast track referral. Purpose: To compare the conventional optometry wet AMD fast track referral service using FAX transmission with a teleophthalmology service using colour fundus photography transmitted to a central retinal grading centre. Method: 40 optometry practices involved in diabetic retinopathy screening were enrolled and had modified computer software installed. Referrals were made by conventional fast track FAX to the macular clinic, and patients were photographed by the optometrist and the images transmitted to a central grading centre. The results of the two pathways were compared in terms of 1) speed of diagnosis and 2) sensitivity and specificity of diagnosis of wet AMD. Results: Over a ten-month period, 62 consecutive patients were referred. The mean time for the conventional pathway was 20.8 days (range 3-34), and for the new teleophthalmology pathway 6.9 days (range 1-13). Sensitivity of technician grading of images was 96% and specificity 53%; for the consultant ophthalmologist, sensitivity was 96% and specificity 87%. The technician showed a learning effect, with specificity increasing from 30.7% for the first cohort of 31 patients to 70.6% for the second cohort. One patient had images that could not be graded. Conclusion: Rapid referral of wet AMD cases by optometrists using modified diabetic retinopathy screening software allows fast and accurate diagnosis and may reduce unnecessary referrals. Retinal grading technicians can be trained to grade wet AMD images.
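For readers unfamiliar with the grading metrics reported above, the short sketch below shows how sensitivity and specificity are computed from true/false positive and negative counts; the counts used are illustrative assumptions, not the study's data.

```python
def sensitivity_specificity(tp, fn, tn, fp):
    """Sensitivity = TP / (TP + FN); specificity = TN / (TN + FP)."""
    sensitivity = tp / (tp + fn)
    specificity = tn / (tn + fp)
    return sensitivity, specificity

# Made-up counts purely to illustrate the calculation (not the study's data):
# 24 true positives, 1 false negative, 20 true negatives, 17 false positives.
sens, spec = sensitivity_specificity(tp=24, fn=1, tn=20, fp=17)
print(f"sensitivity={sens:.0%}, specificity={spec:.0%}")  # sensitivity=96%, specificity=54%
```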

Relevance:

30.00%

Publisher:

Abstract:

In this paper, a new approach to resource allocation and scheduling that reflects the user's Quality of Experience is presented. The proposed scheduling algorithm is examined in the context of the 3GPP Long Term Evolution (LTE) system. Pause Intensity (PI), an objective, no-reference quality assessment metric, is employed to represent user satisfaction in the scheduler of the eNodeB. PI is in fact a measurement of discontinuity in the service. The performance of the proposed scheduling method is compared with two extreme cases: the maxCI and Round Robin scheduling schemes, which correspond to efficiency- and fairness-oriented mechanisms, respectively. Our work reveals that the proposed method is able to operate between the fairness and efficiency requirements, in favor of higher satisfaction for the users, up to the desired level. © VDE VERLAG GMBH.
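A minimal sketch of a pause-intensity style measure is shown below, computed as the fraction of a playback session spent stalled; this simple ratio and the sample trace are assumptions for illustration and not necessarily the exact metric used in the paper.

```python
# Illustrative pause-intensity style measure: fraction of the playback session
# spent in buffering pauses. The trace below is assumed data, and this simple
# ratio is only a sketch of the kind of no-reference discontinuity metric described.

def pause_intensity(pause_durations_s, total_session_s):
    """Total stalled time divided by total session time (0 = smooth, 1 = fully stalled)."""
    return sum(pause_durations_s) / total_session_s

# A 120 s session with three buffering stalls of 2 s, 5 s and 3 s:
pi = pause_intensity([2.0, 5.0, 3.0], total_session_s=120.0)
print(f"pause intensity = {pi:.3f}")  # 0.083
```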

Relevance:

30.00%

Publisher:

Abstract:

This research focuses on automatically adapting a search engine's size in response to fluctuations in query workload. Deploying a search engine in an Infrastructure as a Service (IaaS) cloud facilitates allocating or deallocating computer resources to or from the engine. Our solution is to contribute an adaptive search engine that will repeatedly re-evaluate its load and, when appropriate, switch over to a different number of active processors. We focus on three aspects, broken out into three sub-problems: Continually determining the Number of Processors (CNP), the New Grouping Problem (NGP) and the Regrouping Order Problem (ROP). CNP is the problem of determining, in the light of changes in the query workload, the ideal number of processors p to keep active in the search engine at any given time. NGP arises once a change in the number of processors has been determined: it must also be determined which groups of search data will be distributed across the processors. ROP is the problem of how to redistribute this data onto the processors while keeping the engine responsive and while also minimising the switchover time and the incurred network load. We propose solutions for these sub-problems. For NGP we propose an algorithm for incrementally adjusting the index to fit the varying number of virtual machines. For ROP we present an efficient method for redistributing data among processors while keeping the search engine responsive. Regarding the solution for CNP, we propose an algorithm that determines the new size of the search engine by re-evaluating its load. We tested the solution's performance using a custom-built prototype search engine deployed in the Amazon EC2 cloud. Our experiments show that when we compare our NGP solution with computing the index from scratch, the incremental algorithm speeds up the index computation 2-10 times while maintaining a similar search performance. The chosen redistribution method is 25% to 50% faster than other methods and reduces the network load by around 30%. For CNP we present a deterministic algorithm that shows a good ability to determine a new size for the search engine. When combined, these algorithms give an adaptive algorithm that is able to adjust the search engine size under a variable workload.
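As a hedged illustration of the CNP idea (an assumption-laden sketch, not the thesis' actual algorithm), the code below shows a simple deterministic controller that re-evaluates measured load and picks a new number of active processors using utilisation thresholds; the thresholds, capacities, and names are assumptions.

```python
# Illustrative CNP-style controller (an assumption, not the thesis' algorithm):
# periodically re-evaluate load and choose a new number of active processors.

def choose_processor_count(current_procs, queries_per_s, capacity_per_proc,
                           low=0.4, high=0.8, min_procs=1, max_procs=64):
    """Scale up when utilisation exceeds `high`, down when it falls below `low`."""
    utilisation = queries_per_s / (current_procs * capacity_per_proc)
    if utilisation > high:
        return min(current_procs + 1, max_procs)
    if utilisation < low and current_procs > min_procs:
        return current_procs - 1
    return current_procs

# Example: 8 processors each handling ~50 queries/s, with the workload at 360 or 120 q/s
print(choose_processor_count(8, 360, 50))  # 9 -> scale up
print(choose_processor_count(8, 120, 50))  # 7 -> scale down
```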

Relevance:

30.00%

Publisher:

Abstract:

In 2010, a household survey was carried out in Hungary among 1037 respondents to study consumer preferences and willingness to pay for health care services. In this paper, we use the data from the discrete choice experiments included in the survey to elicit the preferences of health care consumers about the choice of health care providers. Regression analysis is used to estimate the effect of the improvement of service attributes (quality, access, and price) on patients' choice, as well as the differences among socio-demographic groups. We also estimate the marginal willingness to pay for improvements in attribute levels by calculating marginal rates of substitution. The results show that respondents from a village or the capital, with low education and poor health status, are more driven by changes in the price attribute when choosing between health care providers. Respondents value the good skills and reputation of the physician and the attitude of the personnel most, followed by modern equipment and maintenance of the office/hospital. Access attributes (travelling and waiting time) are less important. The method of discrete choice experiments is useful for revealing patients' preferences and might support the development of an evidence-based and sustainable health policy on patient payments.
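A minimal sketch of the marginal willingness-to-pay calculation mentioned above: with a linear-in-attributes choice model, the marginal rate of substitution between an attribute and price gives MWTP as the negative ratio of their coefficients. The coefficient values below are assumptions, not the survey's estimates.

```python
# Marginal willingness to pay from a linear-in-attributes discrete choice model:
# MWTP_k = -(beta_k / beta_price). The coefficients below are illustrative assumptions.

def marginal_wtp(beta_attribute, beta_price):
    """Willingness to pay for a one-unit improvement in the attribute."""
    return -beta_attribute / beta_price

betas = {
    "physician_skills": 1.20,
    "modern_equipment": 0.65,
    "waiting_time_hours": -0.30,
}
beta_price = -0.004  # marginal utility of price (e.g. per unit of currency)

for attribute, beta in betas.items():
    print(attribute, round(marginal_wtp(beta, beta_price), 1))
```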

Relevance:

30.00%

Publisher:

Abstract:

Annual average daily traffic (AADT) is important information for many transportation planning, design, operation, and maintenance activities, as well as for the allocation of highway funds. Many studies have attempted AADT estimation using the factor approach, regression analysis, time series, and artificial neural networks. However, these methods are unable to account for the spatially variable influence of independent variables on the dependent variable, even though it is well known that spatial context is important to many transportation problems, including AADT estimation. In this study, applications of geographically weighted regression (GWR) methods to estimating AADT were investigated. The GWR-based methods consider the correlations among the variables over space and the spatial non-stationarity of the variables. A GWR model allows different relationships between the dependent and independent variables to exist at different points in space. In other words, model parameters vary from location to location, and the locally linear regression parameters at a point are affected more by observations near that point than by observations further away. The study area was Broward County, Florida. Broward County lies on the Atlantic coast between Palm Beach and Miami-Dade counties. In this study, a total of 67 variables were considered as potential AADT predictors, and six variables (lanes, speed, regional accessibility, direct access, density of roadway length, and density of seasonal households) were selected to develop the models. To investigate the predictive power of the various AADT predictors over space, statistics including local r-square, local parameter estimates, and local errors were examined and mapped. The local variations in the relationships among parameters were investigated, measured, and mapped to assess the usefulness of the GWR methods. The results indicated that the GWR models were able to better explain the variation in the data and to predict AADT with smaller errors than ordinary linear regression models for the same dataset. Additionally, GWR was able to model the spatial non-stationarity in the data, i.e., the spatially varying relationship between AADT and its predictors, which cannot be modeled in ordinary linear regression.
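A minimal sketch of the geographically weighted regression idea described above: each location receives its own weighted least-squares fit, with observation weights decaying with distance under a Gaussian kernel. The synthetic data, bandwidth, and variable names are assumptions, not the study's specification.

```python
import numpy as np

# Minimal GWR sketch: a separate weighted least-squares fit at each location,
# with Gaussian kernel weights that decay with distance. All data here are synthetic.
rng = np.random.default_rng(1)
n = 100
coords = rng.uniform(0, 10, size=(n, 2))                    # site locations
X = np.column_stack([np.ones(n), rng.normal(size=(n, 2))])  # intercept + 2 predictors
beta_true = 1.0 + 0.2 * coords[:, 0]                        # spatially varying effect
y = X[:, 0] + beta_true * X[:, 1] + 0.5 * X[:, 2] + rng.normal(scale=0.1, size=n)

def gwr_coefficients(point, bandwidth=2.0):
    """Local regression coefficients at `point` via distance-weighted least squares."""
    d = np.linalg.norm(coords - point, axis=1)
    w = np.exp(-(d / bandwidth) ** 2)                        # Gaussian kernel weights
    W = np.diag(w)
    return np.linalg.solve(X.T @ W @ X, X.T @ W @ y)

# Local parameter estimates differ across space, unlike ordinary linear regression:
print(gwr_coefficients(np.array([1.0, 5.0])))
print(gwr_coefficients(np.array([9.0, 5.0])))
```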

Relevance:

30.00%

Publisher:

Abstract:

In an effort to improve instruction and better accommodate the needs of students, community colleges are offering courses delivered in a variety of formats that require students to have some level of technology fluency to be successful in the course. This study was conducted to investigate the effects of student socioeconomic status (SES), course delivery method, and course type on enrollment, final course grades, course completion status, and course passing status at a state college. A dataset for 20,456 students of low and not-low SES enrolled in science, technology, engineering, and mathematics (STEM) course types delivered using traditional, online, blended, and web-enhanced course delivery formats at Miami Dade College (MDC), a large open-access 4-year state college located in Miami-Dade County, Florida, was analyzed. A factorial ANOVA using course type, course delivery method, and student SES found no significant differences in final course grades when used to determine whether course delivery methods were equally effective for students of low and not-low SES taking STEM course types. Additionally, three chi-square goodness-of-fit tests were used to investigate differences in enrollment, course completion status, and course passing status by SES, course type, and course delivery method. The findings of the chi-square tests indicated that: (a) there were significant differences in enrollment by SES and course delivery method for the Engineering/Technology, Math, and overall course types but not for the Natural Science course type, and (b) there were no significant differences in course completion status or course passing status by SES and course type overall or by SES and course delivery method overall. However, there were statistically significant but weak relationships between course passing status, SES, and the math course type, as well as between course passing status, SES, and the online and traditional course delivery methods. The mixed findings in the study indicate that strides have been made in closing the theoretical gap in education and technology skills that may exist for students of different SES levels. MDC's course delivery and student support models may assist other institutions in addressing student success in courses that necessitate students having some level of technology fluency.
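As a hedged sketch of the kind of chi-square goodness-of-fit test described above (with made-up counts, not the study's data), the code below tests whether observed enrollments across delivery methods depart from expected proportions.

```python
from scipy.stats import chisquare

# Illustrative chi-square goodness-of-fit test with made-up enrollment counts
# (not the study's data): observed enrollments by delivery method for one SES group
# versus expectations proportional to assumed overall enrollment shares.
observed = [5200, 1800, 900, 600]            # traditional, online, blended, web-enhanced
overall_shares = [0.58, 0.22, 0.11, 0.09]    # assumed overall enrollment proportions
expected = [sum(observed) * share for share in overall_shares]

stat, p_value = chisquare(f_obs=observed, f_exp=expected)
print(f"chi-square = {stat:.1f}, p = {p_value:.4f}")
```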

Relevance:

30.00%

Publisher:

Abstract:

Absolute abundances (concentrations) of dinoflagellate cysts are often determined through the addition of Lycopodium clavatum marker grains as a spike to a sample before palynological processing. An inter-laboratory calibration exercise was set up in order to test the comparability of results obtained in different laboratories, each using its own preparation method. Each of the 23 laboratories received the same amount of homogenized splits of four Quaternary sediment samples. The samples originate from different localities and consist of a variety of lithologies. Dinoflagellate cysts were extracted and counted, and relative and absolute abundances were calculated. The relative abundances proved to be fairly reproducible, notwithstanding a need for taxonomic calibration. By contrast, excessive loss of Lycopodium spores during sample preparation resulted in non-reproducible absolute abundances. The use of oxidation, KOH, warm acids, acetolysis, mesh sizes larger than 15 µm and long ultrasonication (> 1 min) must be avoided to obtain reproducible absolute abundances. The results of this work therefore indicate that the dinoflagellate cyst worker should choose between using the proposed standard method, which circumvents these critical steps, and adding Lycopodium tablets at the end of the preparation when using an alternative method.
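For context, the marker-grain calculation behind the absolute abundances discussed above is typically the ratio of counted cysts to counted Lycopodium spores, scaled by the number of spores added and the sample weight; the sketch below uses assumed counts purely to illustrate that arithmetic.

```python
# Marker-grain (spike) calculation of absolute dinoflagellate cyst abundance:
# cysts per gram = (cysts counted / Lycopodium counted) * Lycopodium added / dry weight.
# The counts below are assumed values for illustration only.

def cysts_per_gram(cysts_counted, lycopodium_counted, lycopodium_added, dry_weight_g):
    return (cysts_counted / lycopodium_counted) * lycopodium_added / dry_weight_g

# A spike nominally containing ~20,000 spores added to 5 g of dry sediment:
print(round(cysts_per_gram(cysts_counted=250, lycopodium_counted=400,
                           lycopodium_added=20000, dry_weight_g=5.0)))  # 2500 cysts/g
```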

Relevance:

30.00%

Publisher:

Abstract:

A mosaic of two WorldView-2 high-resolution multispectral images (acquisition dates: October 2010 and April 2012), in conjunction with field survey data, was used to create a habitat map of the Danajon Bank, Philippines (10°15'0'' N, 124°08'0'' E) using an object-based approach. To create the habitat map, we conducted benthic cover (seafloor) field surveys using two methods. First, we undertook georeferenced point intercept transects (English et al., 1997): for ten sites we recorded habitat cover types at 1 m intervals along 10 m long transects (n = 2,070 points). Second, we conducted georeferenced spot check surveys, placing a viewing bucket in the water to estimate the percent cover of benthic cover types (n = 2,357 points). Survey locations were chosen to cover a diverse and representative subset of the habitats found in the Danajon Bank. The combination of methods was a compromise between the higher accuracy of point intercept transects and the larger sample area achievable through spot check surveys (Roelfsema and Phinn, 2008, doi:10.1117/12.804806). Object-based image analysis, using the field data as calibration data, was used to classify the image mosaic at each of the reef, geomorphic and benthic community levels. The benthic community level segregated the image into a total of 17 pure and mixed benthic classes.
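As a small hedged sketch of how percent benthic cover can be tallied from point-intercept records like those described above (the records shown are assumed, not the survey data):

```python
from collections import Counter

# Illustrative percent-cover tally from point-intercept records (assumed data):
# each entry is the benthic cover type recorded at one 1 m interval along a transect.
records = ["coral", "seagrass", "sand", "coral", "coral", "rubble",
           "seagrass", "sand", "coral", "sand"]

counts = Counter(records)
for cover_type, n in counts.most_common():
    print(f"{cover_type}: {100 * n / len(records):.0f}% cover")
```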

Relevance:

30.00%

Publisher:

Abstract:

Coral reefs are increasingly threatened by global and local anthropogenic stressors, such as rising seawater temperature and nutrient enrichment. These two stressors vary widely across the reef face and parsing out their influence on coral communities at reef system scales has been particularly challenging. Here, we investigate the influence of temperature and nutrients on coral community traits and life history strategies on lagoonal reefs across the Belize Mesoamerican Barrier Reef System (MBRS). A novel metric was developed using ultra-high-resolution sea surface temperatures (SST) to classify reefs as enduring low (lowTP), moderate (modTP), or extreme (extTP) temperature parameters over 10 years (2003 to 2012). Chlorophyll-a (chl a) records obtained for the same interval were employed as a proxy for bulk nutrients and these records were complemented with in situ measurements to "sea truth" nutrient content across the three reef types. Chl a concentrations were highest at extTP sites, medial at modTP sites and lowest at lowTP sites. Coral species richness, abundance, diversity, density, and percent cover were lower at extTP sites compared to lowTP and modTP sites, but these reef community traits did not differ between lowTP and modTP sites. Coral life history strategy analyses showed that extTP sites were dominated by hardy stress-tolerant and fast-growing weedy coral species, while lowTP and modTP sites consisted of competitive, generalist, weedy, and stress-tolerant coral species. These results suggest that differences in coral community traits and life history strategies between extTP and lowTP/modTP sites were driven primarily by temperature differences with differences in nutrients across site types playing a lesser role. Dominance of weedy and stress-tolerant genera at extTP sites suggests that corals utilizing these two life history strategies may be better suited to cope with warmer oceans and thus may warrant further protective status during this climate change interval. Data associated with this project are archived here, including:
- SST data
- Satellite Chl a data
- Nutrient measurements
- Raw coral community survey data
For questions contact Justin Baumann (j.baumann3 gmail.com)

Relevance:

30.00%

Publisher:

Abstract:

Technological developments in biomedical microsystems are opening up new opportunities to improve healthcare procedures. Swallowable diagnostic capsules are an example of this. In this paper, a diagnostic capsule technology is described based on direct-access sensing of the gastrointestinal (GI) fluids throughout the GI tract. The objective of this paper is two-fold: i) to develop a packaging method for a direct-access sensor, and ii) to develop an encapsulation method to protect the system electronics. The integrity of the interconnection after sensor packaging and encapsulation is correlated with its reliability and is thus of importance. The zero-level packaging of the sensor was achieved by using a so-called Flip Chip Over Hole (FCOH) method. This allowed the fluidic sensing media to interface with the sensor, while the rest of the chip, including the electrical connections, could be insulated effectively. Initial tests using an Anisotropic Conductive Adhesive (ACA) interconnect for the FCOH demonstrated good electrical connections and functionality of the sensor chip. A preliminary encapsulation trial of the flip-chipped sensor on a flexible test substrate has also been carried out and showed that silicone encapsulation of the system is a viable option.