Abstract:
Complex networks have been studied extensively due to their relevance to many real-world systems such as the world-wide web, the internet, and biological and social systems. During the past two decades, studies of such networks in different fields have produced many significant results concerning their structures, topological properties, and dynamics. Three well-known properties of complex networks are scale-free degree distribution, the small-world effect and self-similarity. The search for additional meaningful properties and the relationships among these properties is an active area of current research. This thesis investigates a newer aspect of complex networks, namely their multifractality, which is an extension of the concept of self-similarity.

The first part of the thesis aims to confirm that the study of properties of complex networks can be expanded to a wider field including more complex weighted networks. The real networks that have been shown to possess the self-similarity property in the existing literature are all unweighted networks. We use protein-protein interaction (PPI) networks as a key example to show that their weighted networks inherit the self-similarity of the original unweighted networks. Firstly, we confirm that the random sequential box-covering algorithm is an effective tool for computing the fractal dimension of complex networks. This is demonstrated on the Homo sapiens and E. coli PPI networks as well as their skeletons. Our results verify that the fractal dimension of the skeleton is smaller than that of the original network because the shortest distance between nodes is larger in the skeleton; hence, for a fixed box size, more boxes are needed to cover the skeleton. Then we adopt the iterative scoring method to generate weighted PPI networks for five species, namely Homo sapiens, E. coli, yeast, C. elegans and Arabidopsis thaliana. Using the random sequential box-covering algorithm, we calculate the fractal dimensions of both the original unweighted PPI networks and the generated weighted networks. The results show that self-similarity is still present in the generated weighted PPI networks. This finding will be useful for our treatment of the networks in the third part of the thesis.

The second part of the thesis aims to explore the multifractal behaviour of different complex networks. Fractals such as the Cantor set, the Koch curve and the Sierpinski gasket are homogeneous, since these fractals consist of a geometrical figure that repeats on an ever-reduced scale. Fractal analysis is a useful method for their study. However, real-world fractals are not homogeneous; there is rarely an identical motif repeated on all scales. Their singularity may vary over different subsets, implying that these objects are multifractal. Multifractal analysis is a useful way to systematically characterize the spatial heterogeneity of both theoretical and experimental fractal patterns. However, the tools for multifractal analysis of objects in Euclidean space are not suitable for complex networks. In this thesis, we propose a new box-covering algorithm for multifractal analysis of complex networks. This algorithm is demonstrated in the computation of the generalized fractal dimensions of some theoretical networks, namely scale-free networks, small-world networks and random networks, and of a class of real networks, namely the PPI networks of different species.
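As a rough illustration of the random sequential box-covering procedure referred to above, the following Python sketch covers a network with boxes of a given radius and estimates the box-counting dimension from the slope of log N_B against log r_B. It is not the thesis's code: the adjacency-dictionary input format, the choice of radii, the handling of paths through already covered nodes, and the use of the best covering over a few random trials are all assumptions made to keep the example self-contained.

```python
import math
import random
from collections import deque

def random_sequential_box_covering(adj, radius, seed=0):
    """One pass of a random sequential box-covering heuristic: repeatedly pick
    an uncovered node at random as a box centre and assign every still-uncovered
    node within `radius` hops of it to that box.
    `adj` maps each node to an iterable of its neighbours."""
    rng = random.Random(seed)
    uncovered = set(adj)
    boxes = 0
    while uncovered:
        centre = rng.choice(tuple(uncovered))
        # Breadth-first search up to `radius` hops from the centre.  Distances
        # are measured in the full network here; some variants of the algorithm
        # restrict paths to uncovered nodes instead.
        dist = {centre: 0}
        queue = deque([centre])
        while queue:
            node = queue.popleft()
            if dist[node] == radius:
                continue
            for nb in adj[node]:
                if nb not in dist:
                    dist[nb] = dist[node] + 1
                    queue.append(nb)
        uncovered.difference_update(dist)
        boxes += 1
    return boxes

def box_counting_dimension(adj, radii=(1, 2, 3, 4), trials=10):
    """Estimate the fractal (box-counting) dimension as minus the slope of
    log N_B against log r_B, taking the best covering found over several trials."""
    xs, ys = [], []
    for r in radii:
        n_b = min(random_sequential_box_covering(adj, r, seed=t)
                  for t in range(trials))
        xs.append(math.log(r))
        ys.append(math.log(n_b))
    mx, my = sum(xs) / len(xs), sum(ys) / len(ys)
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    return -slope
```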
Our main finding is the existence of multifractality in scale-free networks and PPI networks, while multifractal behaviour is not confirmed for small-world networks and random networks. As another application, we generate gene interaction networks for patients and healthy people using the correlation coefficients between microarrays of different genes. Our results confirm the existence of multifractality in gene interaction networks. This multifractal analysis thus provides a potentially useful tool for gene clustering and identification.

The third part of the thesis aims to investigate the topological properties of networks constructed from time series. Characterizing complicated dynamics from time series is a fundamental problem of continuing interest in a wide variety of fields. Recent works indicate that complex network theory can be a powerful tool for analysing time series. Many existing methods for transforming time series into complex networks share a common feature: they define the connectivity of a complex network by the mutual proximity of different parts (e.g., individual states, state vectors, or cycles) of a single trajectory. In this thesis, we propose a new method to construct networks from time series: we define nodes as vectors of a certain length extracted from the time series, and the weight of the edge between any two nodes as the Euclidean distance between the corresponding vectors. We apply this method to build networks for fractional Brownian motions, whose long-range dependence is characterised by their Hurst exponent. We verify the validity of this method by showing that time series with stronger correlation, and hence a larger Hurst exponent, tend to have a smaller fractal dimension and hence smoother sample paths. We then construct networks via the technique of the horizontal visibility graph (HVG), which has been widely used recently. We confirm a known linear relationship between the Hurst exponent of fractional Brownian motion and the fractal dimension of the corresponding HVG network. In the first application, we apply our newly developed box-covering algorithm to calculate the generalized fractal dimensions of the HVG networks of fractional Brownian motions as well as those of binomial cascades and five bacterial genomes. The results confirm the monoscaling of fractional Brownian motion and the multifractality of the rest. As an additional application, we discuss the resilience of networks constructed from time series via two different approaches: the visibility graph (VG) and the horizontal visibility graph. Our finding is that the degree distribution of VG networks of fractional Brownian motions is scale-free (i.e., it follows a power law), meaning that one needs to destroy a large percentage of nodes before the network collapses into isolated parts; for HVG networks of fractional Brownian motions, in contrast, the degree distribution has exponential tails, implying that HVG networks would not survive the same kind of attack.
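The horizontal visibility graph construction mentioned above is a standard algorithm and can be sketched briefly. The code below is illustrative rather than taken from the thesis; it uses ordinary Brownian motion (Hurst exponent 0.5) as a stand-in test series, since a full fractional Brownian motion generator is beyond the scope of the example.

```python
import math
import random
from itertools import accumulate

def horizontal_visibility_graph(series):
    """Horizontal visibility graph: two samples i < j are linked iff every
    intermediate sample is strictly smaller than both series[i] and series[j];
    consecutive samples are therefore always linked."""
    edges = []
    for i in range(len(series) - 1):
        blocker = -math.inf          # largest value seen strictly between i and j
        for j in range(i + 1, len(series)):
            if series[i] > blocker and series[j] > blocker:
                edges.append((i, j))
            blocker = max(blocker, series[j])
            if blocker >= series[i]:
                break                # nothing beyond j can see sample i any more
    return edges

if __name__ == "__main__":
    rng = random.Random(1)
    series = list(accumulate(rng.gauss(0, 1) for _ in range(2000)))
    edges = horizontal_visibility_graph(series)
    degree = [0] * len(series)
    for i, j in edges:
        degree[i] += 1
        degree[j] += 1
    print("nodes:", len(series), "edges:", len(edges), "max degree:", max(degree))
```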
Abstract:
The preparation of a series of nickel dichloride complexes with bulky diphosphinomethane chelate ligands R2PCH2PR′2 is reported. Reaction with the appropriate Grignard reagent leads to the corresponding dimethyl and dibenzyl complexes. Cationic monomethyl and mono-η3-benzyl complexes are generated from these dialkyl complexes by protonation with [H(OEt2)2]+[B(3,5-(CF3)2C6H3)4]−, while the complex [(dtbpm-κ2P)Ni(η3-CH(CH2Ph)Ph)]+[B(3,5-(CF3)2C6H3)4]− is obtained by protonation of the Ni(0) olefin complex (dtbpm-κ2P)Ni(η2-trans-stilbene). Crystal structures of representative dichloride, dimethyl, dibenzyl, cationic methyl, and cationic η3-benzyl complexes are reported. Solutions of the cations polymerize ethylene under mild conditions, without the need for an activating agent, to form polyethylene of high molecular weight and low degree of chain branching. In comparison to the Ni methyl cations, the η3-benzyl cation complexes are more stable and somewhat less active, but still very efficient, in C2H4 polymerization. The effect on the resulting polyethylene of varying the substituents R, R′ on the phosphine ligand has been examined, and a clear trend towards longer-chain PE with less branching in the presence of bulkier substituents on the diphosphine has been found. Density functional calculations have been used to examine the rapid suprafacial η3 to η3 haptotropic shift processes of the [(R2PCH2PR′2)Ni] fragment and the η3–η1 change in the coordination mode of the benzyl group required for polymerization in these cations.
Abstract:
“Turtle Twilight” is a two-screen video installation. Paragraphs of text adapted from a travel blog type across the left-hand screen. A computer-generated image of a tropical sunset is slowly animated on the right-hand screen. The two screens are accompanied by an atmospheric stock music track. This work examines how we construct, represent and deploy ‘nature’ in our contemporary lives. It mixes cinematic codes with image, text and sound gleaned from online sources. Extending Nicolas Bourriaud’s understanding of ‘postproduction’ and the creative and critical strategies of ‘editing’, it questions the relationship between contemporary screen culture, nature, desire and contemplation.
Abstract:
This thesis examines consumer-initiated value co-creation behaviour in the context of convergent mobile online services, using a Service-Dominant logic (SD logic) theoretical framework. It focuses on non-reciprocal marketing phenomena such as open innovation and user-generated content, whereby new viable business models are derived and consumer roles and community become essential to the success of business. Attention to customers' roles and personalised experiences in value co-creation has been recognised in the literature (e.g., Prahalad & Ramaswamy, 2000; Prahalad, 2004; Prahalad & Ramaswamy, 2004). Similarly, in a subsequent iteration of their 2004 version of the foundations of SD logic, Vargo and Lusch (2006) replaced the concept of value co-production with value co-creation and suggested that a value co-creation mindset is essential to underpin the firm-customer value creation relationship. Much of this focus, however, has been limited to firm-initiated value co-creation (e.g., B2B or B2C), while consumer-initiated value creation, particularly consumer-to-consumer (C2C), has received little attention in the SD logic literature. While it is recognised that not every consumer wishes to make the effort to engage extensively in co-creation processes (MacDonald & Uncles, 2009), some consumers are not satisfied with a standard product; instead they engage in the effort required for personalisation that potentially leads to greater value for themselves, and which may benefit not only the firm but other consumers as well. The literature suggests that there are consumers who do initiate such behaviour and expend effort to engage in co-creation activity (e.g., Gruen, Osmonbekov, & Czaplewski, 2006, 2007; MacDonald & Uncles, 2009). In terms of consumers' engagement in value proposition (co-production) and value actualisation (co-creation), SD logic (Vargo & Lusch, 2004, 2008) provides a new lens that enables marketing scholars to transcend existing marketing theory and facilitates marketing practitioners to initiate service-centric and value co-creation oriented marketing practices. Although the active role of the consumer is acknowledged in the SD logic oriented literature, we know little about how and why consumers participate in a value co-creation process (Payne, Storbacka, & Frow, 2008). The literature suggests that researchers should focus on areas such as C2C interaction (Gummesson, 2007; Nicholls, 2010) and consumer experience sharing and co-creation (Belk, 2009; Prahalad & Ramaswamy, 2004). In particular, this thesis seeks to better understand consumer-initiated value co-creation, in line with the notion that consumers can be resource integrators (Baron & Harris, 2008). The reason for this focus is that consumers today are more empowered in both online and offline contexts (Füller, Mühlbacher, Matzler, & Jawecki, 2009; Sweeney, 2007). Active consumers take the initiative to engage and co-create solutions with other active actors in the market for the betterment of their lives (Ballantyne & Varey, 2006; Grönroos & Ravald, 2009). In terms of the organisation of the thesis, it first takes a 'zoom-out' (Vargo & Lusch, 2011) approach and develops the Experience Co-Creation (ECo) framework, which is aligned with balanced centricity (Gummesson, 2008) and the Actor-to-Actor worldview (Vargo & Lusch, 2011). This ECo framework is based on an extended 'SD logic friendly lexicon'
(Lusch & Vargo, 2006): value initiation and value initiator, value-in-experience, betterment centricity and betterment outcomes, and experience co-creation contexts, derived from five gaps identified in the SD logic literature review. The framework is also designed to accommodate broader marketing phenomena (i.e., both reciprocal and non-reciprocal marketing phenomena). After zooming out and establishing the ECo framework, the thesis takes a zoom-in approach and places attention back on the value co-creation process. Owing to the scope of the current research, this thesis focuses specifically on non-reciprocal value co-creation phenomena initiated by consumers in online communities. Two emergent concepts, User Experience Sharing (UES) and Co-Creative Consumers, are proposed, grounded in the ECo framework. Together, these two theorised concepts shed light on the following two propositions: (1) User Experience Sharing derives value-in-experience as consumers take the initiative to participate in value co-creation, and (2) Co-Creative Consumers are value initiators who perform UES. Three research questions were identified underpinning the scope of this research: RQ1: What factors influence consumers to exhibit User Experience Sharing behaviour? RQ2: Why do Co-Creative Consumers participate in User Experience Sharing as part of value co-creation behaviour? RQ3: What are the characteristics of Co-Creative Consumers? To answer these research questions, two theoretical models were developed: the User Experience Sharing Behaviour Model (UESBM), grounded in the Theory of Planned Behaviour framework, and the Co-Creative Consumer Motivation Model (CCMM), grounded in the Motivation, Opportunity, Ability framework. The models use SD logic consistent constructs and draw upon multiple streams of literature, including consumer education, consumer psychology and consumer behaviour, and organisational psychology and organisational behaviour. These constructs include User Experience Sharing with Other Consumers (UESC), User Experience Sharing with Firms (UESF), Enjoyment in Helping Others (EIHO), Consumer Empowerment (EMP), Consumer Competence (COMP), Intention to Engage in User Experience Sharing (INT), Attitudes toward User Experience Sharing (ATT) and Subjective Norm (SN) in the UESBM, and User Experience Sharing (UES), Consumer Citizenship (CIT), Relating Needs of Self (RELS), Relating Needs of Others (RELO), Newness (NEW), Mavenism (MAV), Use Innovativeness (UI), Personal Initiative (PIN) and Communality (COMU) in the CCMM. Many of these constructs are relatively new to marketing and require further empirical evidence for support. Two studies were conducted to address the corresponding research questions. Study One was conducted to calibrate and re-specify the proposed models. Study Two was a replication study to confirm the proposed models. In Study One, data were collected from a PC DIY online community. In Study Two, the majority of the data were collected from Apple product online communities. The data were examined using structural equation modelling and cluster analysis. Considering the nature of the forums, the Study One data are considered to reflect some characteristics of Prosumers, and the Study Two data some characteristics of Innovators. The results drawn from two independent samples (N = 326 and N = 294) provide empirical support for the overall structure theorised in the research models.
The results in both models show that Enjoyment in Helping Others and Consumer Competence in the UESBM, and Consumer Citizenship and Relating Needs in the CCMM, have significant impacts on UES. These results were consistent across Study One and Study Two. The results also support the conceptualisation of Co-Creative Consumers and indicate that Co-Creative Consumers are individuals who are able to relate to their own needs and the needs of others, and who feel a responsibility to share their valuable personal experiences. In general, the results shed light on the question "How and why do consumers voluntarily participate in the value co-creation process?" The findings provide evidence to conceptualise User Experience Sharing behaviour as well as the Co-Creative Consumer through the lens of SD logic. This research is a pioneering study that incorporates and empirically tests SD logic consistent constructs to examine a particular area of the logic, namely consumer-initiated value co-creation behaviour. This thesis also informs practitioners about the factors that facilitate engagement in both firm-initiated and consumer-initiated online communities.
Abstract:
Diminished student interest in science, technology, engineering and mathematics (STEM) is recognised by educators, researchers and public policy makers as a concerning global trend. Inviting stakeholders like scientists and industry specialists to discuss their work is one means schools use to facilitate student engagement in the sciences. However, these visits generally comprise one-off sessions with minimal relevance to students’ particular and ongoing learning needs. This case study investigated coteaching and cogenerative dialoguing with parents in teaching a Year-8 multidisciplinary unit with science and technology foci. Two parents cotaught alongside the resident teacher and researcher over eight months. This paper concentrates on one parent, a medical scientist by profession. Data sources included video and audio recordings of cogenerative dialogues and classroom interactions, student work samples and journal entries. Data were interrogated using the sociological constructs of fields and capitals and the dialectic of structure|agency. The findings reveal (a) how the parent’s science and technology knowledge was tailored to the students’ needs, both initially and continually, and (b) that student-generated data indicated enhanced engagement in science and technology. The research speaks to schools and governments about enhancing STEM education by furthering collaborative relationships with relevant stakeholders.
Abstract:
Virtual environments can provide, through digital games and online social interfaces, extremely exciting forms of interactive entertainment. Because of their capability to display and manipulate information in natural and intuitive ways, such environments have found extensive applications in decision support, education and training in the health and science domains, amongst others. Currently, the burden of validating both the interactive functionality and the visual consistency of virtual environment content falls entirely on developers and play-testers. While considerable research has been conducted into assisting the design of virtual world content and mechanics, to date only limited contributions have been made regarding the automatic testing of the underpinning graphics software and hardware. The aim of this thesis is to determine whether the correctness of the images generated by a virtual environment can be quantitatively defined, and automatically measured, in order to facilitate the validation of the content. In an attempt to provide an environment-independent definition of visual consistency, a number of classification approaches were developed. First, a novel model-based object description was proposed to enable reasoning about the color and geometry changes of virtual entities during a play-session. From such an analysis, two view-based connectionist approaches were developed to map from geometry and color spaces to a single, environment-independent, geometric transformation space; we used such a mapping to predict the correct visualization of the scene. Finally, an appearance-based aliasing detector was developed to show how incorrectness, too, can be quantified for debugging purposes. Since computer games rely heavily on the use of highly complex and interactive virtual worlds, they provide an excellent test bed against which to develop, calibrate and validate our techniques. Experiments were conducted on a game engine and other virtual world prototypes to determine the applicability and effectiveness of our algorithms. The results show that quantifying visual correctness in virtual scenes is a feasible enterprise, and that effective automatic bug detection can be performed through the techniques we have developed. We expect these techniques to find application in large 3D game and virtual world studios that require a scalable solution for testing their virtual world software and digital content.
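The thesis's model-based and connectionist classifiers are not reproduced here. As a generic, minimal illustration of how visual consistency might be quantified automatically, the sketch below compares a rendered frame against a reference frame using a mean absolute pixel difference with an arbitrary tolerance; the function names and the threshold are assumptions for the example, not the thesis's method.

```python
import numpy as np

def frame_discrepancy(rendered, reference):
    """Mean absolute per-pixel difference between a rendered frame and a
    reference frame, given as arrays of equal shape with values in [0, 1]."""
    rendered = np.asarray(rendered, dtype=np.float64)
    reference = np.asarray(reference, dtype=np.float64)
    return float(np.mean(np.abs(rendered - reference)))

def frame_is_consistent(rendered, reference, tolerance=0.02):
    """Flag a frame as visually consistent when it stays within `tolerance`
    of the reference on average; the threshold would need calibration per
    environment and says nothing about localised artefacts such as aliasing."""
    return frame_discrepancy(rendered, reference) <= tolerance
```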
Abstract:
Nowadays, everyone can effortlessly access a range of information on the World Wide Web (WWW). As information resources on the web continue to grow tremendously, it becomes progressively more difficult to meet users' high expectations and find relevant information. Although existing search engine technologies can find valuable information, they suffer from the problems of information overload and information mismatch. This paper presents a hybrid Web Information Retrieval approach that allows personalised search using an ontology, a user profile and collaborative filtering. The approach uses the ontology to find the context of the user's query with minimal user involvement. It also updates the user profile automatically over time as the user's behaviour changes. Finally, it uses recommendations from similar users obtained through a collaborative filtering technique. The proposed method is evaluated on the FIRE 2010 dataset and a manually generated dataset. Empirical analysis reveals that the Precision, Recall and F-Score of most queries for many users are improved with the proposed method.
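For reference, the reported evaluation metrics have their usual set-based definitions; a minimal Python sketch (with an invented toy query, not the FIRE 2010 data) follows.

```python
def precision_recall_f1(retrieved, relevant):
    """Set-based Precision, Recall and F-Score for a single query:
    `retrieved` is the result list, `relevant` the judged relevant documents."""
    retrieved, relevant = set(retrieved), set(relevant)
    hits = len(retrieved & relevant)
    precision = hits / len(retrieved) if retrieved else 0.0
    recall = hits / len(relevant) if relevant else 0.0
    f_score = (2 * precision * recall / (precision + recall)
               if precision + recall else 0.0)
    return precision, recall, f_score

# Toy example: 10 retrieved documents, 4 of which are among 6 relevant ones.
print(precision_recall_f1(range(10), {1, 3, 5, 7, 11, 13}))
```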
Abstract:
Local communities are vulnerable to the potential environmental risks associated with construction activity. Currently, little is understood about how perceptions of environmental risks are shaped and spread within a community. A better understanding of this process can help bridge the gap between developers and communities and bring about more sustainable development practices. This paper reports a research methodology which uses social contagion theory to investigate this process. The research adopts a single case study approach of a highly controversial housing project in the greater Sydney metropolitan area. The case study is particularly significant as it investigates an extensive and ongoing community-based protest campaign (dating back almost 20 years) that has generated the longest-standing 24-hour community picket in New South Wales.
Abstract:
The multiple banded antigen (MBA) is a predicted virulence factor of Ureaplasma species. Antigenic variation of the MBA is a potential mechanism by which ureaplasmas avoid immune recognition and cause chronic infections of the upper genital tract of pregnant women. We tested whether the MBA is involved in the pathogenesis of intra-amniotic infection and chorioamnionitis by injecting virulent-derived or avirulent-derived ureaplasma clones (expressing single MBA variants) into the amniotic fluid of pregnant sheep. At 55 days of gestation, pregnant ewes (n = 20) received intra-amniotic injections of virulent-derived or avirulent-derived U. parvum serovar 6 strains (2 × 10⁴ CFU), or 10B medium (n = 5). Amniotic fluid was collected every two weeks post-infection, and fetal tissues were collected at the time of surgical delivery of the fetus (140 days of gestation). Whilst chronic colonisation was established in the amniotic fluid of animals infected with avirulent-derived and virulent-derived ureaplasmas, the severity of chorioamnionitis and fetal inflammation was not different between these groups (p>0.05). MBA size variants (32–170 kDa) were generated in vivo in amniotic fluid samples from both the avirulent and virulent groups, whereas in vitro antibody selection experiments led to the emergence of MBA-negative escape variants in both strains. Anti-ureaplasma IgG antibodies were detected in the maternal serum of animals from the avirulent (40%) and virulent (55%) groups, and these antibodies correlated with increased IL-1β, IL-6 and IL-8 expression in chorioamnion tissue (p<0.05). We demonstrate that ureaplasmas are capable of MBA phase variation in vitro; however, ureaplasmas undergo MBA size variation in vivo, potentially to prevent eradication by the immune response. Size variation of the MBA did not correlate with the severity of chorioamnionitis. Nonetheless, the correlation between a maternal humoral response and the expression of chorioamnion cytokines is a novel finding. This host response may be important in the pathogenesis of inflammation-mediated adverse pregnancy outcomes.
Abstract:
This paper presents a strategy for delayed research method selection in qualitative interpretivist research. An exemplary case details how explorative interviews were designed and conducted in accordance with a paradigm prior to deciding whether to adopt grounded theory or phenomenology for data analysis. The focus here is on determining the most appropriate research strategy, in this case the methodological framing used to conduct the research and represent the findings, both of which are detailed. Research addressing current management issues requires both a flexible framework and the capability to consider the research problem from various angles, in order to derive tangible results for academia with immediate application to business demands. Researchers, and in particular novices, often struggle to decide on an appropriate research method suitable to address their research problem. This often applies to interpretative qualitative research, where it is not always immediately clear which is the most appropriate method to use, as the research objectives shift and crystallize over time. This paper uses an exemplary case to reveal how the strategy of delayed research method selection contributed to deciding whether to adopt grounded theory or phenomenology in the initial phase of a PhD research project. In this case, semi-structured interviews were used for data generation, framed in an interpretivist approach and situated in a business context. Research questions for this study were thoroughly defined and carefully framed in accordance with the research paradigm's principles, while at the same time ensuring that the requirements of both potential research methods were met. The grounded theory and phenomenology methods were compared and contrasted to determine their suitability and whether they met the research objectives, based on a pilot study. The strategy proposed in this paper is an alternative to the more 'traditional' approach, which initially selects the methodological formulation, followed by data generation. In conclusion, the suggested strategy of delayed research method selection is intended to help researchers identify and apply the most appropriate method to their research. This strategy is based on explorations of data generation and analysis in order to derive faithful results from the data generated.
Abstract:
Given the substantial investment in information technology (IT), and the significant impact IT has on organizational success, organizations consume considerable resources in managing the acquisition and use of their IT resources. While various arguments have been proposed as to which IT governance arrangements may work best, our understanding of the effectiveness of such initiatives is limited. We examine the relationship between the effectiveness of IT steering committee-driven IT governance initiatives and a firm's IT management and IT infrastructure related capabilities. We further propose that a firm's IT-related capabilities generated through IT governance initiatives should improve its business processes and firm-level performance. We test these relationships empirically using a field survey. Results suggest that the effectiveness of firms' IT steering committee-driven IT governance initiatives relates positively to the level of their IT-related capabilities. We also found positive relationships between IT-related capabilities and internal process-level performance. Our results further indicate that improvement in internal process-level performance relates positively to improvement in customer service and firm-level performance.
Abstract:
In this paper, three metaheuristics are proposed for solving a class of job shop, open shop, and mixed shop scheduling problems. We evaluate the performance of the proposed algorithms on a set of Lawrence's benchmark instances for the job shop problem, a set of randomly generated instances for the open shop problem, and combined job shop and open shop test data for the mixed shop problem. The computational results show that the proposed algorithms perform extremely well on all three types of shop scheduling problems. The results also reveal that the mixed shop problem is easier to solve than the job shop problem, because the inclusion of more open shop jobs makes the scheduling procedure more flexible.
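The paper's three metaheuristics are not specified in this abstract, so as a generic illustration of how a metaheuristic operates on the job shop variant, the sketch below applies simulated annealing to the common operation-based permutation encoding. The instance format, the swap move and the cooling schedule are assumptions for the example, not the algorithms evaluated in the paper.

```python
import math
import random

def makespan(jobs, op_sequence):
    """Decode an operation-based sequence (each job index appears once per
    operation of that job) into a semi-active schedule and return its makespan.
    jobs[j] is a list of (machine, processing_time) pairs in technological order."""
    next_op = [0] * len(jobs)
    job_ready = [0] * len(jobs)
    machine_ready = {}
    for j in op_sequence:
        machine, p = jobs[j][next_op[j]]
        start = max(job_ready[j], machine_ready.get(machine, 0))
        job_ready[j] = machine_ready[machine] = start + p
        next_op[j] += 1
    return max(job_ready)

def simulated_annealing(jobs, iterations=20000, t0=50.0, cooling=0.9995, seed=0):
    """Swap-neighbourhood simulated annealing over the operation permutation."""
    rng = random.Random(seed)
    seq = [j for j, ops in enumerate(jobs) for _ in ops]
    rng.shuffle(seq)
    current = best = makespan(jobs, seq)
    best_seq, temperature = list(seq), t0
    for _ in range(iterations):
        a, b = rng.randrange(len(seq)), rng.randrange(len(seq))
        seq[a], seq[b] = seq[b], seq[a]
        candidate = makespan(jobs, seq)
        if candidate <= current or rng.random() < math.exp((current - candidate) / temperature):
            current = candidate
            if current < best:
                best, best_seq = current, list(seq)
        else:
            seq[a], seq[b] = seq[b], seq[a]   # undo the rejected swap
        temperature *= cooling
    return best, best_seq

if __name__ == "__main__":
    # Tiny 3-job, 3-machine instance: (machine, processing_time) per operation.
    jobs = [[(0, 3), (1, 2), (2, 2)],
            [(0, 2), (2, 1), (1, 4)],
            [(1, 4), (2, 3), (0, 1)]]
    print(simulated_annealing(jobs))
```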
Abstract:
Chronic venous leg ulcers are a detrimental health issue plaguing our society, resulting in long-term pain, immobility and decreased quality of life for a large proportion of sufferers. The frequency of these chronic wounds has led current research to focus on the wound environment to provide important information regarding the prolonged, fluctuating or static healing patterns of these wounds. Disruption to the normal wound healing process results in the release of multiple factors into the wound environment that could correlate with wound chronicity. These biochemical factors can often be detected by non-invasively sampling chronic wound fluid (CWF) from the site of injury. Of note, whilst there are numerous studies comparing acute and chronic wound fluids, there have not been any reports in the literature of a longitudinal study tracking biochemical changes in wound fluid as patients transition from a non-healing to a healed state. Initially, the objective of this study was to identify biochemical changes in CWF associated with wound healing using a proteomic approach. The proteomic approach incorporated a multi-dimensional liquid chromatography fractionation technique coupled with mass spectrometry (MS) to enable identification of proteins present at lower concentrations in CWF. Not surprisingly, many of the proteins identified in wound fluid were acute phase proteins normally expressed during the inflammatory phase of healing. However, the number of proteins positively identified by MS was quite low. This was attributed to the diverse range of concentrations of protein species in CWF, which makes it challenging to detect the diagnostically relevant low molecular weight proteins. In view of this, SELDI-TOF MS was also explored as a means to target low molecular weight proteins in sequential patient CWF samples during the course of healing. Unfortunately, the results generated did not yield any peaks of interest that were altered as wounds transitioned to a healed state. During the course of the proteomic assessment of CWF, it became evident that a fraction of non-proteinaceous compounds absorbed strongly at 280 nm. Subsequent analyses confirmed that most of these compounds were in fact part of the purine catabolic pathway, possessing distinctive aromatic rings that result in high absorbance at 254 nm. The accumulation of these purinogenic compounds in CWF suggests that the wound bed is poorly oxygenated, resulting in a switch to anaerobic metabolism and consequently ATP breakdown. In addition, the presence of the terminal purine catabolite, uric acid (UA), indicates that the enzyme xanthine oxidoreductase (XOR) catalyses the conversion of hypoxanthine to xanthine and finally to UA. More importantly, these studies provide evidence for the first time of the exogenous presence of XOR in CWF. XOR is the only enzyme in humans capable of catalysing the production of UA in conjunction with a burst of the highly reactive superoxide radical and other oxidants such as H2O2. Excessive release of these free radicals in the wound environment can cause cellular damage, disrupting the normal wound healing process. In view of this, a sensitive and specific assay was established for monitoring low concentrations of these catabolites in CWF. This procedure combined high performance liquid chromatography (HPLC) with tandem mass spectrometry and multiple reaction monitoring (MRM).
The assay was selective, using specific MRM transitions and HPLC separations for each analyte, making it ideal for the detection and quantitation of purine catabolites in CWF. The results demonstrated that elevated levels of UA were detected in wound fluid obtained from patients with clinically worse ulcers. This suggests that XOR is active in the wound site, generating significant amounts of reactive oxygen species (ROS). In addition, analysis of purine precursors revealed elevated levels in wound fluid from patients with less severe ulcers. Taken together, the results generated in this thesis suggest that monitoring changes in purine catabolites in CWF is likely to provide valuable information regarding the healing patterns of chronic venous leg ulcers. XOR catabolism of purine precursors not only provides a means of monitoring the onset, prognosis and progress of chronic venous leg ulcers, but also offers a potential therapeutic target: inhibiting XOR blocks UA and ROS production. Targeting a combination of these purinogenic compounds and XOR could lead to the development of novel point-of-care diagnostic tests. Therefore, further investigation of these processes during wound healing will be worthwhile and may assist in elucidating the pathogenesis of this disease state, which in turn may lead to the development of new diagnostics and therapies that target these processes.
Abstract:
In recent years, various observers have pointed to the shifting paradigms of cultural and societal participation and economic production in developed nations. These changes are facilitated (although, importantly, not solely driven) by the emergence of new, participatory technologies of information access, knowledge exchange, and content production, many of which are associated with Internet and new media technologies. In an online context, such technologies are now frequently described as social software, social media, or Web 2.0, but their impact is no longer confined to cyberspace as an environment that is somehow different and separate from ‘real life’: user-led content and knowledge production is increasingly impacting on media, economy, law, social practices, and democracy itself.
Abstract:
The concept of produsage developed from the realisation that new language was needed to describe the new phenomena emerging from the intersection of Web 2.0, user-generated content, and social media since the early years of the new millennium. When hundreds, thousands, maybe tens of thousands of participants utilise online platforms to collaborate in the development and continuous improvement of a wide variety of content – from software to informational resources to creative works – and when this work takes place through a series of more or less unplanned, ad hoc, almost random cooperative encounters, then describing these processes using terms developed during the industrial revolution no longer makes much sense. When – precisely because what takes place here is no longer a form of production in any conventional sense of the word – the outcomes of these massively distributed collaborations appear in the form of constantly changing, permanently mutable bodies of work which are owned at once by everyone and no-one, by the community of contributors as a whole but by none of them as individuals, then to conceptualise them as fixed and complete products in the industrial meaning of the term is to miss the point. When what results from these efforts is of a quality (in both depth and breadth) that enables it to substitute for, replace, and even undermine the business model of long-established industrial products, even though it relies precariously on volunteer contributions, and when these volunteer efforts make it possible for some contributors to find semi- or fully professional employment in their field, then conventional industrial logic is turned on its head.