919 results for website usability


Relevance:

10.00%

Publisher:

Abstract:

We present a new technique called 'Tilt Menu' for extending the selection capabilities of pen-based interfaces. The Tilt Menu is implemented using the 3D orientation information of pen devices while performing selection tasks. The Tilt Menu has the potential to aid traditional one-handed techniques, as it generates a secondary input (e.g., a command or parameter selection) while the user draws or interacts with the pen tip, without requiring the second hand or another device. We conducted two experiments to explore the performance of the Tilt Menu. In the first experiment, we analyzed the effect of the Tilt Menu's parameters, such as menu size and item orientation, on its usability. The results of the first experiment suggest some design guidelines for the Tilt Menu. In the second experiment, the Tilt Menu was compared to two other techniques while performing connect-the-dot tasks using a freeform drawing mechanism. The results of the second experiment show that the Tilt Menu performs better than the Tool Palette and is as good as the Toolglass.
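The core mapping behind such a technique, turning the pen's tilt direction into a pie-menu selection, can be sketched as follows. The function name, dead-zone parameter, and tilt encoding are assumptions for illustration, not details from the paper.

```python
import math

def tilt_menu_item(tilt_x, tilt_y, n_items=8, dead_zone=0.15):
    """Map a pen's tilt vector to one of n_items pie-menu slices.

    tilt_x, tilt_y: components of the pen's tilt away from vertical,
    each in [-1, 1]. Returns the selected item index, or None while
    the pen is within the central dead zone (nearly vertical).
    """
    magnitude = math.hypot(tilt_x, tilt_y)
    if magnitude < dead_zone:
        return None  # pen is close to vertical: no selection yet
    # Azimuth of the tilt direction, measured clockwise from "north".
    azimuth = math.atan2(tilt_x, tilt_y) % (2 * math.pi)
    # Offset by half a slice so item 0 is centered on north.
    slice_width = 2 * math.pi / n_items
    return int(((azimuth + slice_width / 2) % (2 * math.pi)) // slice_width)
```

A dead zone of this kind is what lets the user keep drawing with a near-vertical pen without accidentally triggering selections.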

Relevance:

10.00%

Publisher:

Abstract:

This paper proposes a usability evaluation method that can be applied on the basis of an interface system's design specification. First, the interface system design is abstracted as a finite state automaton, and the state transition probabilities of the automaton are predicted according to a probabilistic rule grammar. Then, a usability evaluation algorithm for the interface is proposed that takes the user's proficiency into account. Finally, a worked example of the usability computation for a mobile phone interface is discussed. The method can be applied early in the life cycle of an interface system, allowing different design alternatives to be compared at an early stage and reducing development risk.
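The idea of scoring a design from a probabilistic finite state machine can be sketched with a simple metric: the expected number of user actions needed to reach a goal state. The FSM below and the fixed-point solver are illustrative assumptions, not the paper's actual algorithm.

```python
def expected_steps(transitions, goal, iterations=1000):
    """Estimate the expected number of user actions to reach `goal`
    in a probabilistic finite state machine.

    transitions: dict mapping state -> list of (next_state, probability).
    Solved by fixed-point iteration of E[s] = 1 + sum p * E[s'].
    """
    cost = {s: 0.0 for s in transitions}
    for _ in range(iterations):
        for s, outs in transitions.items():
            if s == goal:
                continue
            cost[s] = 1.0 + sum(p * cost[t] for t, p in outs)
    return cost

# Toy phone-menu FSM: from Home, a proficient user goes straight to
# Messages with probability 0.8, or detours via Settings first.
fsm = {
    "Home": [("Messages", 0.8), ("Settings", 0.2)],
    "Settings": [("Home", 1.0)],
    "Messages": [],
}
costs = expected_steps(fsm, goal="Messages")
```

A design whose transition probabilities (conditioned on user proficiency) yield a lower expected step count would score better, which matches the paper's goal of comparing design alternatives before implementation.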

Relevance:

10.00%

Publisher:

Abstract:

Pen-based user interfaces (PUIs) have drawn significant interest owing to their intuitiveness and convenience. While much of the research focuses on the technology, the usability of PUIs has remained relatively low because human factors have not been considered sufficiently. Scenario-centric design is an ideal way to improve usability; however, such designs pose some problems in practical use. To cope with these design issues, the concept of "interface scenarios" is proposed to facilitate interface design and to help users understand the interaction process in such designs. The proposed scenario-focused development method for PUIs is coupled with a practical application to show its effectiveness and usability.

Relevance:

10.00%

Publisher:

Abstract:

In this dissertation, we investigated two types of traveling ionospheric disturbances (TIDs)/gravity waves (GWs), triggered respectively by auroral energy input during super geomagnetic storms and by the solar terminator (ST) under quiet geomagnetic conditions (Kp < 3+), using TEC measurements from the global network of GPS receivers. Research into the generation and propagation of TIDs/GWs during storms greatly enhances our understanding of how energy is transported from the high-latitude magnetosphere to the low-latitude ionosphere, and of the conjugate effects of TID propagation between the northern and southern hemispheres. Our results revealed that the conjugacy of propagation direction between the northern and southern hemispheres is subject to the influence of the Coriolis force. We also characterized the evolution of ionospheric disturbances at the global scale. These are important topics that had not been well addressed previously. In addition, we obtained the wave structures of medium-scale TIDs excited by the solar terminator moving over North America, along with the physical mechanisms involved. Our observations confirm that the ST is a stable and repetitive source of ionospheric wave disturbances, and the evidence for solar-terminator-generated disturbances has been demonstrated experimentally via GPS TEC measurements. The main research topics and results of this dissertation are as follows. First, the global traveling ionospheric disturbances during the drastic magnetic storms of October 29–31, 2003 were analyzed using Global Positioning System (GPS) total electron content (TEC) data observed in the Asian-Australian, European and North American sectors. We collected a comprehensive set of TEC data from more than 900 GPS stations of the International GNSS Service (IGS) and introduce a strategy that combines polynomial fitting and multi-channel maximum entropy spectral analysis to obtain TID parameters.
Moreover, in collaboration with my thesis advisor, I developed a two-dimensional imaging technique for TID structures to obtain spatial and temporal maps of large-scale traveling ionospheric disturbances (LSTIDs), displaying clear structures of the TEC perturbations during the passage of TIDs. The results of this study are summarized as follows. (1) Large-scale TIDs (LSTIDs) and medium-scale TIDs (MSTIDs) were detected in all three sectors after the sudden commencement (SC) of the magnetic storm, and their features showed longitudinal and latitudinal dependences. The duration of the TIDs was longer at higher latitudes than at middle latitudes, with a maximum of about 16 h. The TEC variation amplitude of the LSTIDs was larger in the North American sector than in the other two sectors. At lower latitudes, the ionospheric perturbations were more complicated, and their duration and amplitude were relatively longer and larger. (2) The periods and phase speeds of the TIDs differed among the three sectors. In Europe, the TIDs propagated southward; in North America and Asia, they propagated southwestward; in the near-equator region, the disturbances propagated with an azimuth (the propagation direction of the LSTIDs measured clockwise from due north at 0°) of 210°, showing the influence of the Coriolis force; in the Southern Hemisphere, the LSTIDs propagated conjugately northwestward. Both southwestward- and northeastward-propagating LSTIDs were found in the equatorial region. These results mean that the Coriolis effect cannot be ignored in the wave propagation of LSTIDs and that the propagation direction is correlated with polar magnetic activity. (3) On the day before the SC of the magnetic storm (day of year 301), we observed sudden TEC skip disturbances (±10 TECU); these were likely a response to the high proton flux during the solar flare event, rather than to the magnetic storms.
Next, data from the most comprehensive and dense GPS network, in the North American region, were used to analyze the medium-scale traveling ionospheric disturbances (MSTIDs) generated by the moving solar terminator during quiet days in 2005. We applied multi-channel maximum entropy spectral analysis to calculate TID parameters, and found that the occurrence of ST-MSTIDs depends on the season. The results of this study are summarized as follows. (1) MSTIDs stimulated by the moving ST (ST-MSTIDs) are detected at mid-latitudes after the passage of the solar terminator, with lifetimes of 2–3 hours and variation amplitudes of 0.2–0.8 TECU. Spectral analysis indicated that the horizontal wavelength, horizontal phase velocity, and average period of the MSTIDs are around 300 ± 150 km, 150 ± 80 m/s, and 25 ± 15 min, respectively. In addition, ST-MSTIDs have wave fronts elongated along the moving ST direction and almost parallel to the ST. (2) The statistical results demonstrate that in summer the dusk MSTIDs stimulated by the ST are more pronounced than the dawn MSTIDs; in winter, on the contrary, the dawn MSTIDs are more pronounced. (3) Further analysis indicates that the seasonal variation of ST-MSTID occurrence frequency is most probably related to seasonal differences in the EUV flux in the ionospheric region and in the recombination process during the sunrise and sunset periods at mid-latitudes. In conclusion, statistical studies of the propagation characteristics of TIDs excited by two common sources, geomagnetic storms and the moving solar terminator, were carried out with a global GPS TEC database in this thesis, including a statistical study of TID occurrence characteristics using the GPS networks in North America and Europe during solar maximum.
We employed the multi-channel maximum entropy spectral analysis method to diagnose the propagation and evolution characteristics of the ionospheric disturbances, and their regional distribution and climatological variations were revealed by statistical analysis. The results of these studies can improve our knowledge of energy transfer in the solar-terrestrial system and of the coupling between the upper and lower atmosphere (thermosphere-ionosphere-mesosphere). Moreover, our results on TIDs generated by a particular linear source such as the ST are important for developing the physics of ionospheric irregularities and for modeling trans-ionospheric radio wave propagation. In addition, the GPS TEC representation of the ST-generated ionospheric structure offers a better means of investigating this phenomenon. The results of this dissertation are thus of scientific significance for a deeper discussion of energy transfer and coupling in the ionosphere, as well as of practical value for space weather forecasting in the ionospheric region.
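The polynomial-fitting step used to isolate TEC perturbations from the slowly varying background can be sketched as follows. The series, sampling, and fit degree are synthetic illustrations, not values from the thesis.

```python
import math

def polyfit(t, y, degree):
    """Least-squares polynomial fit via the normal equations,
    solved by Gaussian elimination with partial pivoting."""
    n = degree + 1
    a = [[sum(ti ** (i + j) for ti in t) for j in range(n)] for i in range(n)]
    b = [sum((ti ** i) * yi for ti, yi in zip(t, y)) for i in range(n)]
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(a[r][col]))
        a[col], a[piv] = a[piv], a[col]
        b[col], b[piv] = b[piv], b[col]
        for r in range(col + 1, n):
            f = a[r][col] / a[col][col]
            for c in range(col, n):
                a[r][c] -= f * a[col][c]
            b[r] -= f * b[col]
    coeffs = [0.0] * n
    for r in range(n - 1, -1, -1):
        s = b[r] - sum(a[r][c] * coeffs[c] for c in range(r + 1, n))
        coeffs[r] = s / a[r][r]
    return coeffs  # coeffs[i] multiplies t**i

def detrend_tec(t, tec, degree=3):
    """Return the TEC perturbation: observations minus the fitted trend."""
    c = polyfit(t, tec, degree)
    trend = [sum(ci * (ti ** i) for i, ci in enumerate(c)) for ti in t]
    return [yi - fi for yi, fi in zip(tec, trend)]

# Synthetic series: smooth background plus a 20-minute, 0.5 TECU wave.
t = [i / 60.0 for i in range(180)]  # hours, 1-min sampling
tec = [30 + 5 * ti - 1.5 * ti ** 2
       + 0.5 * math.sin(2 * math.pi * ti / (20 / 60)) for ti in t]
residual = detrend_tec(t, tec, degree=3)
```

The residual series approximates the wave component, which would then be passed to the spectral analysis to estimate period, wavelength, and phase velocity.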

Relevance:

10.00%

Publisher:

Abstract:

This dissertation addresses the problems of signal reconstruction and data restoration in seismic data processing, taking representation methods for signals as the main thread and seismic information reconstruction (signal separation and trace interpolation) as the core. On signal representation in natural bases, I present the fundamentals and algorithms of independent component analysis (ICA) and its original applications to natural earthquake signal separation and survey seismic signal separation. On signal representation in deterministic bases, the dissertation proposes least-squares inversion regularization methods for seismic data reconstruction, sparseness constraints, preconditioned conjugate gradient (PCG) methods, and their applications to seismic deconvolution, the Radon transform, and related problems. The core content concerns a de-aliasing algorithm for the reconstruction of unevenly sampled seismic data and its application to seismic interpolation. Although the dissertation discusses two cases of signal representation, they can be integrated into one framework, because both deal with signal or information restoration: the former reconstructs original signals from mixed signals, while the latter reconstructs complete data from sparse or irregular data. Their common goal is to provide pre-processing and post-processing methods for seismic pre-stack depth migration. ICA can separate original signals from their mixtures, or extract the basic structure of the analyzed data. I surveyed the fundamentals, algorithms and applications of ICA. Comparing it with the KL transformation, I proposed the concept of the independent components transformation (ICT). On the basis of the negentropy measure of independence, I implemented FastICA and improved it using the covariance matrix. After analyzing the characteristics of seismic signals, I introduced ICA into seismic signal processing for the first time in the geophysical community, and implemented the separation of noise from seismic signals.
Synthetic and real data examples show the usability of ICA for seismic signal processing, and promising initial results were achieved. ICA was applied to separate earthquake converted waves from multiples in a sedimentary area, with good results, leading to a more reasonable interpretation of subsurface discontinuities. These results show the promise of applying ICA to geophysical signal processing. By virtue of the relationship between ICA and blind deconvolution, I surveyed seismic blind deconvolution and discussed the prospects of applying ICA to it, with two possible solutions. The relationship among PCA, ICA and the wavelet transform is established, and it is proved that the reconstruction of wavelet prototype functions is a Lie group representation. In addition, an over-sampled wavelet transform is proposed to enhance seismic data resolution, validated by numerical examples. The key to pre-stack depth migration is the regularization of pre-stack seismic data, for which seismic interpolation and missing-data reconstruction are necessary. I first review seismic imaging methods in order to argue for the critical effect of regularization. Reviewing seismic interpolation algorithms, I note that de-aliased reconstruction of unevenly sampled data remains a challenge. The fundamentals of seismic reconstruction are discussed first; then sparseness-constrained least-squares inversion and a preconditioned conjugate gradient solver are studied and implemented. Choosing a constraint term with a Cauchy distribution, I programmed the PCG algorithm and implemented sparse seismic deconvolution and high-resolution Radon transformation by PCG, in preparation for seismic data reconstruction. As for seismic interpolation, de-aliased interpolation of evenly sampled data and reconstruction of unevenly sampled data each work well on their own, but the two could not previously be combined.
In this dissertation, a novel Fourier-transform-based method and algorithm are proposed that can reconstruct seismic data that is both unevenly sampled and aliased. I formulate band-limited data reconstruction as a minimum-norm least-squares inversion problem with an adaptive, DFT-weighted norm regularization term. The inverse problem is solved by the preconditioned conjugate gradient method, which makes the solutions stable and quickly convergent. Based on the assumption that seismic data consist of a finite number of linear events, and following the sampling theorem, aliased events can be attenuated via least-squares weights predicted linearly from the low frequencies. Three applications are discussed: interpolation of even gaps in traces, filling of uneven gaps, and reconstruction of high-frequency traces from low-frequency data constrained by a few high-frequency traces. Both synthetic and real data examples show that the proposed method is valid, efficient and applicable. This research is valuable for seismic data regularization and cross-well seismics. To meet the data requirements of 3D shot-profile depth migration, schemes must be adopted to make the data regular and consistent with the velocity dataset. The methods of this dissertation are used to interpolate and extrapolate shot gathers instead of simply embedding zero traces; the migration aperture is thereby enlarged and the migration result improved. The results show the method's effectiveness and practicability.
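The conjugate-gradient solution of a regularized least-squares problem, the workhorse behind this kind of reconstruction, can be sketched as follows. This uses plain damping rather than the thesis's adaptive DFT-weighted norm, and the toy system is illustrative only.

```python
def cg_solve(apply_a, b, iterations=50, tol=1e-10):
    """Conjugate gradient for A x = b, with A symmetric positive
    definite and given only as a matrix-vector product `apply_a`."""
    x = [0.0] * len(b)
    r = b[:]            # residual b - A x (x starts at zero)
    p = r[:]
    rs = sum(ri * ri for ri in r)
    for _ in range(iterations):
        ap = apply_a(p)
        alpha = rs / sum(pi * api for pi, api in zip(p, ap))
        x = [xi + alpha * pi for xi, pi in zip(x, p)]
        r = [ri - alpha * api for ri, api in zip(r, ap)]
        rs_new = sum(ri * ri for ri in r)
        if rs_new < tol:
            break
        p = [ri + (rs_new / rs) * pi for ri, pi in zip(r, p)]
        rs = rs_new
    return x

def damped_least_squares(a, b, damping=0.1):
    """Solve min ||A x - b||^2 + damping * ||x||^2 by running CG
    on the normal equations (A^T A + damping I) x = A^T b."""
    rows, cols = len(a), len(a[0])

    def apply_normal(x):
        ax = [sum(a[i][j] * x[j] for j in range(cols)) for i in range(rows)]
        atax = [sum(a[i][j] * ax[i] for i in range(rows)) for j in range(cols)]
        return [atax[j] + damping * x[j] for j in range(cols)]

    atb = [sum(a[i][j] * b[i] for i in range(rows)) for j in range(cols)]
    return cg_solve(apply_normal, atb)

# Overdetermined toy system whose least-squares solution is (1, 2).
a = [[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]]
b = [1.0, 2.0, 3.0]
x = damped_least_squares(a, b, damping=1e-6)
```

In the seismic setting, `a` would be the sampling-plus-DFT operator and the damping term would be replaced by the sparseness-promoting weights; the solver structure stays the same.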

Relevance:

10.00%

Publisher:

Abstract:

This research addresses problems in public policy-making procedures. In conducting our research, we considered public policy to be the allocation or reallocation of interests or resources among different members of the public. Because resources are limited, administrations must trade off the interests of different segments of society when formulating a policy. Unfortunately, in recent years there have been several mass conflicts over the administration of public policy, which suggests that some people's interests were ignored or harmed by certain policies. According to the theory of procedural justice, people may accept an unexpected result if they consider the procedure just. This research hypothesizes that there are certain problems in current policy-making procedures and that improving these procedures may make policies more acceptable. A pilot study was conducted by interviewing ten scholars from a range of disciplines; the interview transcripts were coded by three analysts. The results indicate that: 1) most of the scholars criticized current public policies as lacking sensitivity to public issues; 2) most of them considered that current public policies do not resolve problems effectively; and 3) all of them considered that psychology research may enhance awareness of public issues and improve the effectiveness of policy. In Study 2, a public policy-making procedure was tracked and compared with a social survey. The Beijing government intended to increase the taxi fare rate to cope with the rising price of petroleum. Although the majority of delegates at a public hearing supported the proposal, a social survey of 186 residents and 63 taxi drivers indicated that both groups opposed it. The findings indicate that the hearing did not represent the opinions of the public, so the resulting policy failed to resolve the problem. Study 3 was a nonequivalent-control-group quasi-experiment.
Visitors to two Internet websites were chosen as subjects for original photo games. In the experimental group, visitors were invited to express their wishes and suggestions about the game rules for one week, and the rules, drawing on these suggestions, were then declared before the game started. In the control group, the rules were simply declared at the beginning of the game. Over the 23 days of the two games, the experimental group submitted more photos than the control group. The results of this research imply that the good will of policy makers is not enough to make a policy effective. Surveying public attitudes at the beginning of the policy-making process can help policy makers better identify public issues, assess the trade-offs among public interests, make policies more acceptable, and foster a harmonious society. The authors suggest that psychology research should take more social-level problems into account in the policy-making process.

Relevance:

10.00%

Publisher:

Abstract:

This work presents information that was gathered, processed and made available in the form of a website, which can be seen as an information instrument for the environmental management of the municipality. That is, the website that was developed and published can serve as a federating tool for the various local initiatives. The website is part of the set of instruments supporting the preparation of Agenda 21 for the Municipality of Campinas-SP, and it makes information on the municipal environment available to all citizens in a simple and transparent way. The numerical, cartographic and iconographic data presented on the website are always of a transitory nature, since they are subject to updating, refinement and enrichment. They should therefore be regarded as working documents, which gives the project an innovative character in its use of the Internet.

Relevance:

10.00%

Publisher:

Abstract:

Examples submitted by invitation to a website intended as a general resource in Australasia. I'm afraid I do not have a pre-publication copy.

Relevance:

10.00%

Publisher:

Abstract:

Considerable effort is required to implement solar radiation models in software. Many existing implementations have efficiency as their main priority rather than re-usability, and this can adversely affect their further development, since the relationships between the software and the physical quantities may be obscured. The Solar Toolkit is an attempt to overcome such barriers by exploiting the current abundance of computing resources and the availability of user-oriented tools such as Microsoft Excel®. The Solar Toolkit takes the form of a set of functions written in Visual Basic for Applications® (VBA), made available under the Academic Free Licence. Transparency is the overriding priority throughout the implementation, so that the Toolkit can provide a platform for further modelling initiatives.
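The one-transparent-formula-per-function style that such a toolkit favors can be illustrated with two standard textbook quantities. These Python analogues (the Toolkit itself is VBA, and these functions are not taken from it) use Cooper's declination approximation and the eccentricity-corrected solar constant.

```python
import math

SOLAR_CONSTANT = 1367.0  # W/m^2

def declination_deg(day_of_year):
    """Solar declination in degrees, from Cooper's (1969) approximation:
    delta = 23.45 * sin(360 * (284 + n) / 365 degrees)."""
    return 23.45 * math.sin(math.radians(360.0 * (284 + day_of_year) / 365.0))

def extraterrestrial_normal_irradiance(day_of_year):
    """Extraterrestrial irradiance on a plane normal to the sun (W/m^2),
    corrected for the varying earth-sun distance over the year."""
    return SOLAR_CONSTANT * (
        1 + 0.033 * math.cos(math.radians(360.0 * day_of_year / 365.0)))
```

Keeping each physical relationship in its own small, named function is exactly the transparency trade-off the abstract describes: slower than a monolithic routine, but far easier to inspect and extend.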

Relevance:

10.00%

Publisher:

Abstract:

Thomas, R. & Urquhart, C. NHS Wales e-library portal evaluation (for the Informing Healthcare Strategy implementation programme). Aberystwyth: Department of Information Studies, University of Wales Aberystwyth. Follow-on to the NHS Wales User Needs study. Sponsorship: Informing Healthcare, NHS Wales.

Relevance:

10.00%

Publisher:

Abstract:

Durbin, J., Urquhart, C. & Yeoman, A. (2003). Evaluation of resources to support production of high quality health information for patients and the public. Final report for NHS Research Outputs Programme. Aberystwyth: Department of Information Studies, University of Wales Aberystwyth. Sponsorship: Department of Health

Relevance:

10.00%

Publisher:

Abstract:

The SIEGE (Smoking Induced Epithelial Gene Expression) database is a clinical resource for compiling and analyzing gene expression data from epithelial cells of the human intra-thoracic airway. This database supports a translational research study whose goal is to profile the changes in airway gene expression that are induced by cigarette smoke. RNA is isolated from airway epithelium obtained at bronchoscopy from current-, former- and never-smoker subjects, and hybridized to Affymetrix HG-U133A GeneChips, which measure the expression level of ~22,500 human transcripts. The microarray data generated, along with relevant patient information, are uploaded to SIEGE by study administrators using the database's web interface, found at http://pulm.bumc.bu.edu/siegeDB. Perl scripts integrated with SIEGE perform various quality-control functions, including the processing, filtering and formatting of stored data. The R statistical package is used to import database expression values and execute a number of statistical analyses, including t-tests, correlation coefficients and hierarchical clustering. Values from all statistical analyses can be queried through CGI-based tools and web forms found in the 'Search' section of the database website. Query results are embedded with graphical capabilities as well as links to other databases containing valuable gene resources, including Entrez Gene, GO, BioCarta, GeneCards, dbSNP and the NCBI Map Viewer.
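The per-transcript t-test that the database runs in R can be sketched in plain Python. This is Welch's unequal-variance statistic, and the expression values below are synthetic illustrations, not SIEGE data.

```python
import math

def welch_t(group_a, group_b):
    """Welch's two-sample t statistic for comparing the expression of
    one transcript between two subject groups."""
    na, nb = len(group_a), len(group_b)
    ma = sum(group_a) / na
    mb = sum(group_b) / nb
    va = sum((x - ma) ** 2 for x in group_a) / (na - 1)  # sample variances
    vb = sum((x - mb) ** 2 for x in group_b) / (nb - 1)
    return (ma - mb) / math.sqrt(va / na + vb / nb)

# Toy log-scale expression values for one transcript.
current_smokers = [7.9, 8.1, 8.4, 8.2, 8.0]
never_smokers = [6.9, 7.1, 7.0, 7.2, 6.8]
t_stat = welch_t(current_smokers, never_smokers)
```

A large |t| for a transcript flags it as differentially expressed between groups; in practice the p-values would then be adjusted for the ~22,500 simultaneous tests.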

Relevance:

10.00%

Publisher:

Abstract:

BACKGROUND: Cardiovascular disease (CVD) and its most common manifestations, including coronary heart disease (CHD), stroke, heart failure (HF), and atrial fibrillation (AF), are major causes of morbidity and mortality. In many industrialized countries, CVD claims more lives each year than any other disease; heart disease and stroke are the first and third leading causes of death in the United States. Prior investigations have reported several single-gene variants associated with CHD, stroke, HF, and AF. We report a community-based genome-wide association study of major CVD outcomes. METHODS: In 1345 Framingham Heart Study participants from the largest 310 pedigrees (54% women, mean age 33 years at entry), we analyzed associations of 70,987 qualifying SNPs (Affymetrix 100K GeneChip) with four major CVD outcomes: major atherosclerotic CVD (n = 142; myocardial infarction, stroke, CHD death), major CHD (n = 118; myocardial infarction, CHD death), AF (n = 151), and HF (n = 73). Participants free of the condition at entry were included in proportional hazards models. We analyzed model-based deviance residuals using generalized estimating equations to test associations between SNP genotypes and traits in additive genetic models, restricted to autosomal SNPs with minor allele frequency ≥ 0.10, genotype call rate ≥ 0.80, and Hardy-Weinberg equilibrium p-value ≥ 0.001. RESULTS: Six associations yielded p < 10^-5. The lowest p-values for each CVD trait were as follows: major CVD, rs499818, p = 6.6 x 10^-6; major CHD, rs2549513, p = 9.7 x 10^-6; AF, rs958546, p = 4.8 x 10^-6; HF, rs740363, p = 8.8 x 10^-6. Of note, we found associations of a 13 kb region on chromosome 9p21 with major CVD (p = 1.7–1.9 x 10^-5) and major CHD (p = 2.5–3.5 x 10^-4) that confirm associations with CHD in two recently reported genome-wide association studies.
Also, rs10501920 in CNTN5 was associated with AF (p = 9.4 x 10^-6) and HF (p = 1.2 x 10^-4). Complete results for these phenotypes can be found at the dbGaP website: http://www.ncbi.nlm.nih.gov/projects/gap/cgi-bin/study.cgi?id=phs000007. CONCLUSION: No association attained genome-wide significance, but several intriguing findings emerged. Notably, we replicated associations of chromosome 9p21 with major CVD. Additional studies are needed to validate these results. Finding genetic variants associated with CVD may point to novel disease pathways and identify potential targeted preventive therapies.
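The SNP quality-control filters stated in the methods (minor allele frequency, call rate, and Hardy-Weinberg equilibrium p-value) can be sketched as follows. The thresholds come from the abstract; the genotype counts and function names are made up for illustration.

```python
import math

def hwe_pvalue(n_aa, n_ab, n_bb):
    """One-degree-of-freedom chi-square test for Hardy-Weinberg
    equilibrium from observed genotype counts at one SNP."""
    n = n_aa + n_ab + n_bb
    p = (2 * n_aa + n_ab) / (2 * n)  # frequency of allele A
    q = 1 - p
    expected = [p * p * n, 2 * p * q * n, q * q * n]
    chi2 = sum((o - e) ** 2 / e
               for o, e in zip([n_aa, n_ab, n_bb], expected))
    # Survival function of a chi-square variable with 1 df.
    return math.erfc(math.sqrt(chi2 / 2))

def passes_qc(n_aa, n_ab, n_bb, n_missing,
              min_maf=0.10, min_call_rate=0.80, min_hwe_p=0.001):
    """Apply the paper's stated SNP filters: MAF >= 0.10,
    call rate >= 0.80, and HWE p-value >= 0.001."""
    n_called = n_aa + n_ab + n_bb
    call_rate = n_called / (n_called + n_missing)
    p = (2 * n_aa + n_ab) / (2 * n_called)
    maf = min(p, 1 - p)
    return (maf >= min_maf and call_rate >= min_call_rate
            and hwe_pvalue(n_aa, n_ab, n_bb) >= min_hwe_p)
```

SNPs failing any of the three filters would be excluded before the proportional-hazards association testing.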

Relevance:

10.00%

Publisher:

Abstract:

BACKGROUND: The Framingham Heart Study (FHS), founded in 1948 to examine the epidemiology of cardiovascular disease, is among the most comprehensively characterized multi-generational studies in the world. Many collected phenotypes have substantial genetic contributors, yet most genetic determinants remain to be identified. Using single nucleotide polymorphisms (SNPs) from a 100K genome-wide scan, we examine the associations of common polymorphisms with phenotypic variation in this community-based cohort and provide a full-disclosure, web-based resource of results for future replication studies. METHODS: Adult participants (n = 1345) of the largest 310 pedigrees in the FHS, many biologically related, were genotyped with the 100K Affymetrix GeneChip. These genotypes were used to assess their contribution to 987 phenotypes collected in the FHS over 56 years of follow-up, including cardiovascular risk factors and biomarkers; subclinical and clinical cardiovascular disease; cancer and longevity traits; and traits in the pulmonary, sleep, neurology, renal, and bone domains. We conducted genome-wide variance-components linkage analyses and population-based and family-based association tests. RESULTS: The participants were white, of European descent, and from the FHS Original and Offspring Cohorts (at examination 1, Offspring mean age 32 ± 9 years, 54% women). This overview summarizes the methods, selected findings and limitations of the results presented in the accompanying series of 17 manuscripts. The presented association results are based on 70,897 autosomal SNPs meeting the following criteria: minor allele frequency ≥ 10%, genotype call rate ≥ 80%, Hardy-Weinberg equilibrium p-value ≥ 0.001, and Mendelian consistency. Linkage analyses are based on 11,200 SNPs and short tandem repeats.
Results of phenotype-genotype linkages and associations for all autosomal SNPs are posted on the NCBI dbGaP website at http://www.ncbi.nlm.nih.gov/projects/gap/cgi-bin/study.cgi?id=phs000007. CONCLUSION: We have created a full-disclosure resource of results, posted on the dbGaP website, from a genome-wide association study in the FHS. Because we used three analytical approaches to examine the association and linkage of 987 phenotypes with thousands of SNPs, our results must be considered hypothesis-generating and need to be replicated. The results of the FHS 100K project, with NCBI web posting, provide a resource for investigators to identify high-priority findings for replication.

Relevance:

10.00%

Publisher:

Abstract:

In college courses dealing with material that requires mathematical rigor, the adoption of a machine-readable representation for formal arguments can be advantageous. Students can focus on a specific collection of constructs that are represented consistently; examples and counterexamples can be evaluated; and assignments can be assembled and checked with the help of an automated formal reasoning system. However, usability and accessibility do not have a high priority, and are not addressed sufficiently well, in the design of many existing machine-readable representations and the corresponding formal reasoning systems. In earlier work [Lap09], we attempted to address this broad problem by proposing several specific design criteria organized around the notion of a natural context: the sphere of awareness a working human user maintains of the relevant constructs, arguments, experiences, and background materials necessary to accomplish the task at hand. We report on our attempt to evaluate these proposed design criteria by deploying in the classroom a lightweight formal verification system designed according to them. The system was used in the instruction of a common application of formal reasoning: proving formal propositions about functional code by induction. We present all of the formal reasoning examples and assignments considered during this deployment, most of which are drawn directly from an introductory text on functional programming. We demonstrate how the design of the system improves the effectiveness and understandability of the examples, and how it aids in the instruction of basic formal reasoning techniques. We make brief remarks about the practical and administrative implications of the system's design from the perspectives of the student, the instructor, and the grader.
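To give a flavor of the kind of assignment described, an induction proof about functional code, here is a sketch in Lean 4. This is illustrative only: the course used its own lightweight verification system, not Lean, and the theorem chosen (list concatenation adds lengths) is a standard introductory exercise.

```lean
-- Proof by induction on the first list: concatenating two lists
-- yields a list whose length is the sum of the two lengths.
theorem length_append (xs ys : List α) :
    (xs ++ ys).length = xs.length + ys.length := by
  induction xs with
  | nil => simp
  | cons x xs ih =>
      -- Unfold one cons step, apply the induction hypothesis,
      -- then close the remaining arithmetic goal.
      simp only [List.cons_append, List.length_cons, ih]
      omega
```

The structure (a base case, an inductive case that invokes the hypothesis, and a final arithmetic step) mirrors what a student would write by hand about the corresponding functional program.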