843 results for Spam email filtering
Abstract:
Intention-based models have been one of the main theoretical orientations in research on the implementation of information and communication technology (ICT). According to these models, actual behavior can be predicted from the intention towards the behavior: if the intention to use a technology is strong, the probability of actual ICT usage increases. The purpose of this study was to find out which factors explain vocational teachers' intention to use ICT in their teaching. In addition, teachers of media and information sciences were compared with teachers of welfare and health. The study also explored how regularly teachers applied ICT and how strong their intention was to keep applying it. This Master's thesis is a quantitative study, and the data were collected using an email survey and e-form. The instruments were based on the decomposed theory of planned behavior. The research group consisted of 22 schools of media and information sciences and 20 schools of welfare and health. The data comprised 231 vocational teachers: 57 worked in media and information sciences and 174 in welfare and health. The data were analyzed using the Mann-Whitney U-test, factor analysis and regression analysis. In addition, categorized results were compared with a previous study. In this study, the intention to use ICT in teaching was explained by the teachers' attitudes and skills and by the attitudes of their work community. However, the environment in which ICT was used, i.e., the technical environment, economic resources and time, did not explain the intention. The results did not directly support any of the intention-based models, but they could be interpreted as congruent with the technology acceptance model. The majority of the teachers used ICT at least weekly, and they had a strong intention to continue doing so in the future. The study also revealed that critical attitudes towards ICT were more common among the teachers of welfare and health. According to the results, it is not possible to state that ICT is unsuitable for any particular profession, because every group that contained teachers with a critical attitude towards ICT also contained teachers with a positive attitude.
Abstract:
Open access is a new model for publishing scientific journals, enabled by the Internet, in which the published articles are freely available for anyone to read. During the 1990s, hundreds of individual open access journals were founded by groups of academics, supported by grants and unpaid voluntary work. During the last five years, other types of open access journals, funded by author charges, have started to emerge, and established publishers have also started to experiment with different variations of open access. This article reports on the experiences of one open access journal (The Electronic Journal of Information Technology in Construction, ITcon) over its ten-year history. In addition to a straightforward account of the lessons learned, the journal is benchmarked against a number of competitors in the same research area, and its development is put into the larger perspective of changes in scholarly publishing. The main finding is that a journal publishing around 20-30 articles per year, equivalent to a typical quarterly journal, can sustainably be produced using an open-source-like production model. The journal outperforms its competitors in some respects, such as speed of publication, availability of the results and balanced global distribution of authorship, and is on a par with them in most other respects. The key statistics for ITcon are: acceptance rate 55%; average speed of publication 6-7 months; 801 subscribers to email alerts; an average of 21 downloads by human readers per paper per month.
Abstract:
Spike detection in neural recordings is the initial step in the creation of brain-machine interfaces. The Teager energy operator (TEO) treats a spike as an increase in the 'local' energy and detects this increase. The performance of the TEO in detecting action potential spikes suffers from its sensitivity to the frequency of spikes in the presence of the noise found in microelectrode array (MEA) recordings. The multiresolution TEO (mTEO) overcomes this shortcoming by tuning the parameter k to an optimal value so as to match the frequency of the spike. In this paper, we present an algorithm for the mTEO that uses the multiresolution structure of wavelets along with built-in lowpass filtering of the subband signals. The algorithm is efficient and can be implemented for real-time processing of neural signals for spike detection. Its performance is tested on a simulated neural signal with 10 spike templates obtained from [14]. The background noise is modeled as a colored Gaussian random process: using the noise standard deviation and autocorrelation functions obtained from recorded data, background noise was simulated by an autoregressive (AR(5)) filter. The simulations show a spike detection accuracy of 90% and above, with less than 5% false positives at an SNR of 2.35 dB, compared to the 80% accuracy and 10% false positives reported in [6] on simulated neural signals.
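Since the abstract describes the operators only in words, the following is a minimal Python sketch of the classical Teager energy operator and a simple multiresolution variant; the resolutions k, the smoothing windows, the synthetic spike and the threshold rule are illustrative assumptions, not the paper's algorithm.

```python
import numpy as np

def teager_energy(x, k=1):
    """psi_k[x](n) = x(n)^2 - x(n-k) * x(n+k); k = 1 is the classical TEO."""
    x = np.asarray(x, dtype=float)
    psi = np.zeros_like(x)
    psi[k:-k] = x[k:-k] ** 2 - x[:-2 * k] * x[2 * k:]
    return psi

def mteo(x, ks=(1, 3, 5)):
    """Multiresolution TEO: smooth each resolution with a short Hamming
    window (a common choice) and take the maximum response over k."""
    responses = []
    for k in ks:
        win = np.hamming(4 * k + 1)
        responses.append(np.convolve(teager_energy(x, k), win / win.sum(), mode="same"))
    return np.max(responses, axis=0)

# Tiny usage example with one synthetic "spike" in white noise (real MEA
# noise is colored, as the abstract notes).
rng = np.random.default_rng(0)
signal = rng.normal(0.0, 1.0, 5000)
signal[1000:1010] += 8.0 * np.hanning(10)
response = mteo(signal)
threshold = 5.0 * np.median(np.abs(response)) / 0.6745   # robust noise scale
spike_samples = np.flatnonzero(response > threshold)
```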
Abstract:
In this paper, we present two new filtered backprojection (FBP) type algorithms for cylindrical-detector helical cone-beam geometry with no position-dependent backprojection weight. The algorithms extend the recent exact Hilbert-filtering-based 2D divergent-beam reconstruction with no backprojection weight to FDK-type algorithms for reconstruction in 3D helical-trajectory cone-beam tomography. The two algorithms, named HFDK-W1 and HFDK-W2, result in better image quality, improved noise uniformity, lower noise and reduced cone-beam artifacts.
Abstract:
Non-Gaussianity of signals or noise often results in significant performance degradation for systems designed under the Gaussian assumption, so non-Gaussian signals and noise require a different modelling and processing approach. In this paper, we discuss a new Bayesian estimation technique for non-Gaussian signals corrupted by colored non-Gaussian noise. The method is based on using zero-mean finite Gaussian mixture models (GMMs) for both signal and noise. The estimation is performed using an adaptive non-causal nonlinear filtering technique. The method involves deriving an estimator in terms of the GMM parameters, which are in turn estimated using the EM algorithm. The proposed filter is of finite length and is computationally feasible. Simulations show that the proposed method gives a significant improvement over the linear filter for a wide variety of noise conditions, including impulsive noise. We also claim that estimating the signal using its correlation with both past and future samples leads to a lower mean squared error than estimation based on past samples only.
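As a rough illustration of the GMM/EM building block, the sketch below fits Gaussian mixtures to synthetic signal and noise training data with scikit-learn's EM implementation and applies a pointwise MMSE estimator; the paper's filter is non-causal and exploits inter-sample correlation, which this memoryless sketch deliberately omits. The mixture sizes and test distributions are assumptions.

```python
import numpy as np
from scipy.stats import norm
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(1)
signal = rng.laplace(0.0, 1.0, 20000)                 # non-Gaussian "signal"
noise = 0.5 * rng.standard_t(df=3, size=20000)        # heavy-tailed "noise"
observed = signal + noise

# Fit finite GMMs to signal and noise training data (EM runs inside .fit).
gmm_s = GaussianMixture(n_components=3, random_state=0).fit(signal[:, None])
gmm_n = GaussianMixture(n_components=3, random_state=0).fit(noise[:, None])

def mmse_estimate(y, gmm_s, gmm_n):
    """Pointwise E[s | y] for y = s + n with independent GMM priors."""
    ms, vs, ws = gmm_s.means_.ravel(), gmm_s.covariances_.ravel(), gmm_s.weights_
    mn, vn, wn = gmm_n.means_.ravel(), gmm_n.covariances_.ravel(), gmm_n.weights_
    y = np.atleast_1d(y)[:, None, None]
    mean_sum = ms[None, :, None] + mn[None, None, :]
    var_sum = vs[None, :, None] + vn[None, None, :]
    # Posterior probability of each (signal component, noise component) pair.
    resp = ws[None, :, None] * wn[None, None, :] * norm.pdf(y, mean_sum, np.sqrt(var_sum))
    resp /= resp.sum(axis=(1, 2), keepdims=True)
    # Per-pair conditional mean is a Wiener-like combination of y.
    cond_mean = ms[None, :, None] + vs[None, :, None] / var_sum * (y - mean_sum)
    return (resp * cond_mean).sum(axis=(1, 2))

estimate = mmse_estimate(observed, gmm_s, gmm_n)
```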
Abstract:
The increasing use of 3D modeling of the human face in face recognition systems, user interfaces, graphics, gaming and the like has made it an area of active study. The majority of 3D sensors rely on color-coded light projection for 3D estimation. Such systems fail to generate any response in regions covered by facial hair (such as beards and mustaches) and hence leave holes in the model that must be filled manually later. We propose wavelet-transform-based analysis to extract a 3D model of the human face from a single image with a projected sinusoidal white-light fringe pattern. The method is robust to texture variations on the face due to the space-frequency localization property of the wavelet transform. It can generate models with pixel-level refinement, since the phase is estimated for each pixel by a continuous wavelet transform. In cases of sparse facial hair, the shape distortions due to hairs can be filtered out, yielding an estimate of the underlying face. We use a low-pass filtering approach to estimate the face texture from the same image. We demonstrate the method on several human faces, both with and without facial hair. Unseen views of the face are generated by texture mapping onto different rotations of the obtained 3D structure. To the best of our knowledge, this is the first attempt to estimate 3D structure for human faces in the presence of facial hair such as beards and mustaches without generating holes in those areas.
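The phase-per-pixel idea can be illustrated with a one-row sketch of continuous-wavelet-transform fringe analysis; the synthetic fringe, carrier frequency, wavelet name ('cmor1.5-1.0' from PyWavelets) and scale range are illustrative assumptions, not the parameters used in the paper.

```python
import numpy as np
import pywt

n = 512
x = np.arange(n)
carrier = 0.15                                    # fringe frequency (assumed)
height = 4.0 * np.exp(-((x - 256) / 80.0) ** 2)   # synthetic depth profile
row = 0.5 + 0.4 * np.cos(2 * np.pi * carrier * x + height)  # one fringe-image row

# Complex Morlet CWT: the ridge of maximum modulus gives the local phase.
scales = np.arange(2, 64)
coeffs, _ = pywt.cwt(row - row.mean(), scales, "cmor1.5-1.0")
ridge = np.abs(coeffs).argmax(axis=0)             # best scale for each pixel
phase = np.angle(coeffs[ridge, np.arange(n)])

# Unwrap and remove the carrier to obtain a phase map proportional to depth.
unwrapped = np.unwrap(phase)
depth_estimate = unwrapped - 2.0 * np.pi * carrier * x
```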
Abstract:
Neural networks find application in many image denoising tasks because of their inherent characteristics, such as nonlinear mapping and self-adaptiveness. The design of filters largely depends on a priori knowledge about the type of noise; as a result, standard filters are application and image specific. Widely used filtering algorithms reduce noisy artifacts by smoothing, but this operation normally smooths the edges as well. Sharpening filters, on the other hand, enhance the high-frequency details, making the image non-smooth. This study proposes an integrated, general approach to designing a finite impulse response filter for image filtering based on a principal component neural network (PCNN), optimized in terms of both visual inspection and an error metric. The algorithm exploits the inter-pixel correlation by iteratively updating the filter coefficients using the PCNN and performs optimal smoothing of the noisy image while preserving both high- and low-frequency features. Evaluation results show that the proposed filter is robust under various noise distributions. Further, the number of unknown parameters is small, and most of them are obtained adaptively from the processed image.
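A very rough sketch of the learning mechanism follows, assuming a single-neuron principal-component rule (Oja's rule) stands in for the paper's PCNN: the leading principal component of the noisy image's patches is learned on-line and then used as a FIR smoothing kernel. Patch size, learning rate and normalization are assumptions.

```python
import numpy as np
from scipy.ndimage import convolve

def learn_pc_filter(image, patch=5, lr=1e-3, epochs=3, seed=0):
    """Learn the leading principal component of image patches with Oja's rule
    and return it, normalized to unit sum, as a FIR smoothing kernel."""
    rng = np.random.default_rng(seed)
    w = rng.normal(size=patch * patch)
    w /= np.linalg.norm(w)
    rows, cols = image.shape
    for _ in range(epochs):
        for i in rng.permutation(rows - patch):
            for j in rng.permutation(cols - patch):
                x = image[i:i + patch, j:j + patch].ravel()
                y = w @ x
                w += lr * y * (x - y * w)          # Oja's learning rule
    kernel = w.reshape(patch, patch)
    s = kernel.sum()
    return kernel / s if abs(s) > 1e-8 else kernel

# Usage on a toy noisy image.
rng = np.random.default_rng(1)
clean = np.kron(rng.integers(0, 2, (16, 16)).astype(float), np.ones((8, 8)))
noisy = clean + rng.normal(0.0, 0.2, clean.shape)
kernel = learn_pc_filter(noisy)
denoised = convolve(noisy, kernel, mode="reflect")
```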
Abstract:
Image filtering techniques have potential applications in biomedical image processing, such as image restoration and image enhancement. The potential of traditional filters largely depends on a priori knowledge about the type of noise corrupting the image, which makes standard filters application specific. For example, the well-known median filter and its variants can remove salt-and-pepper (or impulse) noise at low noise levels, and each such method has its own advantages and disadvantages. In this paper, we introduce a new finite impulse response (FIR) filter for image restoration in which the filter undergoes a learning procedure: the filter coefficients are adaptively updated based on correlated Hebbian learning. The algorithm exploits the inter-pixel correlation in the form of Hebbian learning and hence performs optimal smoothing of noisy images. Applying the proposed filter to images corrupted with Gaussian noise yields restorations of better quality than those produced by averaging and Wiener filters. The restored image is found to be visually appealing and artifact-free.
Abstract:
The problem of time-variant reliability analysis of existing structures subjected to stationary random dynamic excitations is considered. The study assumes that samples of the dynamic response of the structure, under the action of external excitations, have been measured at a set of sparse points on the structure. The utilization of these measurements in updating reliability models postulated prior to making any measurements is considered. This is achieved using dynamic state estimation methods that combine results from Markov process theory and Bayes' theorem. The uncertainties present in the measurements, as well as in the postulated model for the structural behaviour, are accounted for. The samples of external excitations are taken to emanate from known stochastic models, and allowance is made for the ability (or lack of it) to measure the applied excitations. The future reliability of the structure is modeled using the expected structural response conditioned on all the measurements made. This expected response is shown to have a time-varying mean and a random component that can be treated as weakly stationary. For linear systems, an approximate analytical solution to the reliability model updating problem is obtained by combining the theories of the discrete Kalman filter and level crossing statistics. For nonlinear systems, the problem is tackled by combining particle filtering strategies with data-based extreme value analysis. In all these studies, the governing stochastic differential equations are discretized using the strong forms of Ito-Taylor discretization schemes. The possibility of using conditional simulation strategies, when the applied external actions are measured, is also considered. The proposed procedures are exemplified by considering the reliability analysis of a few low-dimensional dynamical systems based on synthetically generated measurement data. The performance of the procedures developed is also assessed using a limited amount of pertinent Monte Carlo simulations.
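For the linear case, the state-estimation building block is the discrete Kalman filter; the sketch below shows it for an assumed single-degree-of-freedom oscillator with displacement measurements (the coupling with level crossing statistics and the reliability updating itself are not shown).

```python
import numpy as np

# Assumed single-degree-of-freedom oscillator, Euler-discretized.
dt, wn, zeta = 0.01, 2.0 * np.pi, 0.02
A = np.array([[0.0, 1.0], [-wn ** 2, -2.0 * zeta * wn]])
F = np.eye(2) + dt * A                    # state transition matrix
H = np.array([[1.0, 0.0]])                # only displacement is measured
Q = np.diag([1e-8, 1e-4])                 # process noise covariance (assumed)
R = np.array([[1e-4]])                    # measurement noise covariance (assumed)

def kalman_filter(measurements, F, H, Q, R):
    x, P = np.zeros((2, 1)), np.eye(2)
    estimates = []
    for z in measurements:
        x = F @ x                          # predict
        P = F @ P @ F.T + Q
        S = H @ P @ H.T + R                # update
        K = P @ H.T @ np.linalg.inv(S)
        x = x + K @ (np.array([[z]]) - H @ x)
        P = (np.eye(2) - K @ H) @ P
        estimates.append(x.ravel().copy())
    return np.array(estimates)

rng = np.random.default_rng(2)
t = np.arange(2000) * dt
measured = 0.01 * np.sin(wn * t) + rng.normal(0.0, 0.01, t.size)
state_estimates = kalman_filter(measured, F, H, Q, R)   # displacement, velocity
```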
Abstract:
A method used for screening drugs of abuse must be sensitive, selective, simple, fast and reproducible. The aim of this work was to develop a simple but sensitive sample pretreatment method for the qualitative screening of benzodiazepines and amphetamine derivatives in urine using a micropillar electrospray ionization chip (μPESI), which would offer an alternative to the immunological methods used in screening, whose sensitivity and selectivity are inadequate. At the same time, the aim was to examine how well the micropillar electrospray chip performs in the analysis of biological samples. The pretreatment was optimized separately for benzodiazepines and amphetamine derivatives. The pretreatment methods used were liquid-liquid extraction, solid-phase extraction with an Oasis HLB cartridge and with a ZipTip® pipette tip, and dilution and filtration without extraction. Based on the measurements, the work focused on optimizing the ZipTip® extraction. In the optimization, the analytes were spiked into blank urine at their predefined cut-off concentrations: benzodiazepines at 200 ng/ml and amphetamine derivatives at 300 ng/ml. For the benzodiazepines, each extraction step was optimized; as a result, the sample pH was adjusted to 5, the phase was conditioned with acetonitrile, equilibrated and washed with a mixture of water (pH 5) and acetonitrile (10% v/v), and eluted with a mixture of acetonitrile, formic acid and water (95:1:4 v/v/v). For the amphetamine derivatives, the pH values of the sample and the solvents were optimized; as a result, the sample pH was adjusted to 10, the phase was conditioned with a mixture of water and ammonium bicarbonate (pH 10, 1:1 v/v), equilibrated and washed with a mixture of acetonitrile and water (1:5 v/v), and eluted with methanol. The optimized extractions were tested with authentic urine samples supplied by Yhtyneet Medix Laboratoriot, and the results were compared with those of quantitative GC/MS analysis. The benzodiazepine samples were hydrolyzed before extraction to improve sensitivity. The authentic samples were analyzed with a Q-TOF instrument in Viikki. In addition, the hydrolyzed benzodiazepine samples were measured with the TOF instrument at Yhtyneet Medix Laboratoriot. Based on the results, the developed method requires further optimization to work reliably. A particular problem was the scatter of the results observed between replicates; the manual sample introduction should be made more reproducible. In the analysis of the authentic benzodiazepine samples the problem was false negative results, and in the analysis of the amphetamine derivatives, false positive results. The false negatives are explained by the method's lack of sensitivity, and the false positives by contamination of the instrument, the chips or the solvents.
Abstract:
The near flow field of small-aspect-ratio elliptic turbulent free jets (issuing from a nozzle and an orifice) was studied experimentally using 2D PIV. Two-point velocity correlations in these jets revealed the extent and orientation of the large-scale structures in the major and minor planes. Spatial filtering of the instantaneous velocity field using a Gaussian convolution kernel shows that, while a single large vortex ring circumscribing the jet appears to be present at the nozzle exit, the orifice jet exhibits a number of smaller vortex-ring pairs close to the jet exit. The smaller length scale observed in the orifice jet is representative of the smaller azimuthal vortex rings that generate an axial vortex field as they are convected. This results in axis switching in the orifice jet, possibly through a mechanism different from the self-induction process observed in the contoured nozzle jet flow.
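The Gaussian spatial filtering step can be illustrated as follows; the velocity field, grid and filter width are placeholders rather than the measured PIV data.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

ny, nx = 128, 128
rng = np.random.default_rng(3)
u = rng.normal(0.0, 1.0, (ny, nx))        # streamwise velocity component
v = rng.normal(0.0, 1.0, (ny, nx))        # cross-stream velocity component

sigma = 4.0                               # Gaussian kernel width in grid points
u_large, v_large = gaussian_filter(u, sigma), gaussian_filter(v, sigma)
u_small, v_small = u - u_large, v - v_large   # residual small-scale motions

# Out-of-plane vorticity of the filtered field, useful for locating vortex rings.
dx = dy = 1.0
omega_z = np.gradient(v_large, dx, axis=1) - np.gradient(u_large, dy, axis=0)
```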
Abstract:
The new paradigm of connectedness and empowerment brought by the interactivity of Web 2.0 has been challenging the traditional centralized performance of mainstream media. The media corporation has been able to survive these strong winds by transforming itself into a global multimedia business network embedded in the network society. By establishing networks, e.g. networks of production and distribution, the global multimedia business network has been able to identify potential solutions by opening the doors to innovation in a decentralized and flexible manner. In this emerging context of re-organization, traditional practices like sourcing need to be re-explained, and that is precisely what this thesis attempts to tackle. Drawing on ICT and the network society, the study seeks to explain, within the Finnish context, the particular case of Helsingin Sanomat (HS) and its relations with the youth news agency, Youth Voice Editorial Board (NÄT). In that sense, the study can be regarded as an explanatory embedded single case study, where HS is the principal unit of analysis and NÄT its embedded unit of analysis. The thesis reached its explanations through interrelated steps. First, it determined the role of ICT in HS's sourcing practices. Then it mapped an overview of HS's sourcing relations and provided a context in which NÄT was located. Finally, it established conceptualized institutional relational data between HS and NÄT for their subsequent measurement through social network analysis. The data set was collected via qualitative interviews with online and offline editors of HS as well as interviews with NÄT's personnel. The study concluded that ICT interactivity and user-generated content (UGC) are not sourcing tools as such but mechanisms used by HS for getting ideas that could turn into potential news stories. When it comes to visual communication, however, some exceptions were found: the lack of official sources amid the demand for immediacy leads HS to rely on ICT interaction and UGC. ICT's input into sourcing may be more noticeable than meets the eye if the interaction and UGC are well organized and coordinated into proper and innovative networks of alternative content collaboration. Currently, HS performs this sourcing practice via two projects that differ precisely in how they are coordinated. The first project, Omakaupunki, is coordinated internally by the Sanoma Group-owned media houses HS, Vartti and Metro. The second project is coordinated externally. The external alternative sourcing network, as it was labeled, consists of three actors: HS, NÄT (the professionals in charge) and the youth. This network is a balanced and complete triad in which the actors connect themselves in relations of feedback, recognition, creativity and filtering. However, as innovation is approached very reluctantly, this content collaboration remains a laboratory of experiments, a 'collaboratory'.
Abstract:
Homomorphic analysis and pole-zero modeling of electrocardiogram (ECG) signals are presented in this paper. Four typical ECG signals are considered and deconvolved into their minimum-phase and maximum-phase components through cepstral filtering, with a view to studying the possibility of more efficient feature selection from the component signals for diagnostic purposes. The complex cepstra of the signals are linearly filtered to extract the basic wavelet and the excitation function. ECG signals are, in general, mixed phase, and hence exponential weighting is applied to aid their deconvolution. The basic wavelet for a normal ECG approximates the action potential of the heart muscle fiber, and the excitation function corresponds to the excitation pattern of the heart muscles during a cardiac cycle. The ECG signals and their components are pole-zero modeled, and the pole-zero pattern of the models can give a clue for classifying normal and abnormal signals. Moreover, storing only the model parameters can result in a data reduction of more than 3:1 for normal signals sampled at a moderate 128 samples/s.
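A textbook-style sketch of the cepstral deconvolution step is given below: compute the complex cepstrum, keep only the low-quefrency part to obtain the basic wavelet, and invert. The exponential weighting factor, the synthetic pulse train and the lifter cutoff are illustrative assumptions, and phase-unwrapping subtleties such as linear-phase removal are ignored.

```python
import numpy as np
from scipy.fft import fft, ifft

def complex_cepstrum(x):
    spectrum = fft(x)
    log_spec = np.log(np.abs(spectrum) + 1e-12) + 1j * np.unwrap(np.angle(spectrum))
    return np.real(ifft(log_spec))

def inverse_complex_cepstrum(ceps):
    return np.real(ifft(np.exp(fft(ceps))))

fs = 128                                   # sampling rate mentioned in the paper
t = np.arange(0.0, 2.0, 1.0 / fs)
pulse_train = np.exp(-(((t - 0.5) % 1.0) - 0.1) ** 2 / 0.001)   # crude periodic pulse
weighted = pulse_train * 0.99 ** np.arange(t.size)              # exponential weighting

ceps = complex_cepstrum(weighted)

# Short-pass lifter: keep low quefrencies (basic wavelet), discard the rest
# (excitation); the cutoff of 15 samples is an assumption.
lifter = np.zeros_like(ceps)
cutoff = 15
lifter[:cutoff] = 1.0
lifter[-cutoff + 1:] = 1.0
basic_wavelet = inverse_complex_cepstrum(ceps * lifter)
```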
Abstract:
The open development model of software production has been characterized as the future model of knowledge production and distributed work. The open development model refers to publicly available source code ensured by an open source license, and to the extensive and varied distributed participation of volunteers enabled by the Internet. Contemporary spokesmen of open source communities and academics view open source development as a new form of volunteer work activity characterized by a hacker ethic and bazaar governance. The development of the Linux operating system is perhaps the best-known example of such an open source project: it started as an effort by a user-developer and grew quickly into a large project with hundreds of user-developers as contributors. However, in hybrids, in which firms participate in open source projects oriented towards end-users, it seems that most users do not write code. In this study, the OpenOffice.org project, initiated by Sun Microsystems, represents such a project. In addition, Finnish public sector ICT decision-making concerning open source use is studied. The purpose is to explore the assumptions, theories and myths related to the open development model by analysing the discursive construction of the OpenOffice.org community: its developers, users and management. This qualitative study aims at shedding light on the dynamics and challenges of community construction and maintenance, and the related power relations in hybrid open source, by asking two main research questions: How are the structure and membership constellation of the community, specifically the relation between developers and users, linguistically constructed in hybrid open development? What characterizes Internet-mediated virtual communities, how can they be defined, and how do they differ from hierarchical forms of knowledge production on the one hand and from traditional volunteer communities on the other? The study utilizes sociological, psychological and anthropological concepts of community for understanding the connection between the real and the imaginary in so-called virtual open source communities. Intermediary methodological and analytical concepts are borrowed from discourse and rhetorical theories, and a discursive-rhetorical approach is offered as a methodological toolkit for studying texts and writing in Internet communities. The empirical chapters approach the problem of community and its membership from four complementary points of view. The data comprise mailing list discussions, personal interviews, web page writings, email exchanges, field notes and other historical documents. The four viewpoints are: 1) the community as conceived by volunteers; 2) the individual contributor's attachment to the project; 3) public sector organizations as users of open source; and 4) the community as articulated by the community manager. I arrive at four conclusions concerning my empirical studies (1-4) and two general conclusions (5-6). 1) Sun Microsystems and the OpenOffice.org Groupware volunteers failed to develop the necessary and sufficient open code and open dialogue to ensure collaboration, thus splitting the Groupware community into the volunteers ('we') and the firm ('them'). 2) Instead of separating intrinsic and extrinsic motivations, I find that volunteers' unique patterns of motivation are tied to changing objects and personal histories prior to and during participation in the OpenOffice.org Lingucomponent project.
Rather than seeing volunteers as a unified community, they can be better understood as independent entrepreneurs in search of a collaborative community. The boundaries between work and hobby are blurred and shifting, thus questioning the usefulness of the concept of 'volunteer'. 3) The public sector ICT discourse portrays a dilemma and tension between the freedom to choose, use and develop one's desktop in the spirit of open source, on the one hand, and the striving for better desktop control and maintenance by IT staff and user advocates, on the other. The link between the global OpenOffice.org community and local end-user practices is weak and mediated by the problematic relationship between IT staff and (end-)users. 4) Authoring the community can be seen as a new managerial practice typical of hybrid open source communities. The ambiguous concept of community is a powerful strategic tool for orienting towards multiple real and imaginary audiences, as evidenced in the global membership rhetoric. 5) The changing and contradictory discourses of this study show a change in the conceptual system and developer-user relationship of the open development model. This change is characterized as a movement from a hacker ethic and bazaar governance towards a more professionally and strategically regulated community. 6) Community is simultaneously real and imagined, and can be characterized as a 'runaway community'. Discursive action can be seen as a specific type of online open source engagement: hierarchies and structures are created through discursive acts. Keywords: Open Source Software, open development model, community, motivation, discourse, rhetoric, developer, user, end-user
Abstract:
In a number of applications of computerized tomography, the ultimate goal is to detect and characterize objects within a cross section. Detection of the edges of regions of different contrast yields the required information. The problem of detecting edges from projection data is addressed. It is shown that the class of linear edge detection operators used on images can be applied to detect edges directly from projection data. This not only reduces the computational burden but also avoids the difficulties of postprocessing a reconstructed image. It is accomplished by a convolution backprojection operation. For example, with the Marr-Hildreth edge detection operator, the filtering function to be used on the projection data is the Radon transform of the Laplacian of the 2-D Gaussian function, combined with the reconstruction filter. Simulation results showing the efficacy of the proposed method and a comparison with edges detected from the reconstructed image are presented.
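Because the Radon transform of a Laplacian-of-Gaussian reduces to the 1-D second derivative of a Gaussian along the detector coordinate, the approach can be sketched as below; the phantom, the Gaussian width and the use of scikit-image's radon/iradon are illustrative assumptions, not the authors' implementation.

```python
import numpy as np
from scipy.ndimage import convolve1d
from skimage.data import shepp_logan_phantom
from skimage.transform import radon, iradon

image = shepp_logan_phantom()
theta = np.linspace(0.0, 180.0, 180, endpoint=False)
sinogram = radon(image, theta=theta)            # detector coordinate is axis 0

# 1-D second derivative of a Gaussian = Radon transform of the 2-D LoG
# (up to a constant factor); sigma is an assumed smoothing scale.
sigma = 2.0
t = np.arange(-15, 16, dtype=float)
d2g = (t ** 2 - sigma ** 2) / sigma ** 4 * np.exp(-t ** 2 / (2.0 * sigma ** 2))

filtered_sino = convolve1d(sinogram, d2g, axis=0)
log_image = iradon(filtered_sino, theta=theta, filter_name="ramp")

# Edges of the cross section are the zero crossings of the LoG response.
sign = np.sign(log_image)
edges = (sign[:-1, :-1] != sign[1:, :-1]) | (sign[:-1, :-1] != sign[:-1, 1:])
```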