41 results for 230101 Mathematical Logic, Set Theory, Lattices and Combinatorics


Relevance:

100.00%

Publisher:

Abstract:

The competition between photoinduced electron transfer (PET) and other de-excitation pathways such as fluorescence and phosphorescence can be controlled within designed molecular structures. Depending on the particular design, the resulting optical output is thus a function of various inputs such as ion concentration and excitation light dose. Once digitized into binary code, these input-output patterns can be interpreted according to Boolean logic. The single-input logic types YES and NOT cover simple sensors, and the double- (or higher-) input logic types represent other gates such as AND and OR. Logic-based arithmetic processors such as half-adders and half-subtractors are also featured. Naturally, a principal application of the more complex gates is in multi-sensing contexts.
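The half-adder mentioned above can be sketched as a truth table: a minimal illustration, assuming two digitized chemical inputs (ion present = 1) read out on two optical channels, with sum on one channel (XOR) and carry on another (AND). The function and channel assignment are illustrative, not a description of any specific molecular device.

```python
# Illustrative molecular half-adder logic (hypothetical device):
# two binary chemical inputs produce a sum output (XOR, e.g. one
# fluorescence channel) and a carry output (AND, another channel).

def half_adder(input_a: int, input_b: int) -> tuple[int, int]:
    """Return (sum, carry) for two binary inputs."""
    sum_bit = input_a ^ input_b    # XOR: exactly one input high
    carry_bit = input_a & input_b  # AND: both inputs high
    return sum_bit, carry_bit

for a in (0, 1):
    for b in (0, 1):
        s, c = half_adder(a, b)
        print(f"inputs=({a},{b}) -> sum={s}, carry={c}")
```

The same digitization step turns any measured input-output pattern of such a molecule into one row of a table like this, which is then matched against a known gate.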


We define and prove the existence of free Banach lattices in the category of Banach lattices and contractive lattice homomorphisms, and establish some of their fundamental properties. We give much more detailed results about their structure in the case when there are only a finite number of generators, and give several Banach lattice characterizations of the number of generators being, respectively, one, finite or countable. We define a Banach lattice P to be projective if, whenever X is a Banach lattice, J is a closed ideal in X, Q : X → X/J is the quotient map, T : P → X/J is a linear lattice homomorphism and ε > 0, there exists a linear lattice homomorphism T̂ : P → X such that T = Q ∘ T̂ and ∥T̂∥ ≤ (1 + ε)∥T∥. We establish the connection between projective Banach lattices and free Banach lattices, describe several families of Banach lattices that are projective and prove that some are not.


The operation of supply chains (SCs) has for many years been focused on efficiency, leanness and responsiveness. This has resulted in reduced slack in operations, compressed cycle times, increased productivity and minimised inventory levels along the SC. Combined with tight tolerance settings for the realisation of logistics and production processes, this has led to SC performances that are frequently not robust. SCs are becoming increasingly vulnerable to disturbances, which can decrease the competitive power of the entire chain in the market. Moreover, in the case of food SCs non-robust performances may ultimately result in empty shelves in grocery stores and supermarkets.
The overall objective of this research is to contribute to Supply Chain Management (SCM) theory by developing a structured approach to assess SC vulnerability, so that robust performances of food SCs can be assured. We also aim to help companies in the food industry to evaluate their current state of vulnerability, and to improve their performance robustness through a better understanding of vulnerability issues. The following research questions (RQs) stem from these objectives:
RQ1: What are the main research challenges related to (food) SC robustness?
RQ2: What are the main elements that have to be considered in the design of robust SCs and what are the relationships between these elements?
RQ3: What is the relationship between the contextual factors of food SCs and the use of disturbance management principles?
RQ4: How can the impact of disturbances in (food) SC processes on the robustness of (food) SC performances be systematically assessed?
To answer these RQs we used different methodologies, both qualitative and quantitative. For each question, we conducted a literature survey to identify gaps in existing research and define the state of the art of knowledge on the related topics. For the second and third RQ, we conducted both exploration and testing on selected case studies. Finally, to obtain more detailed answers to the fourth question, we used simulation modelling and scenario analysis for vulnerability assessment.
Main findings are summarised as follows.
Based on an extensive literature review, we answered RQ1. The main research challenges were related to the need to define SC robustness more precisely, to identify and classify disturbances and their causes in the context of the specific characteristics of SCs and to make a systematic overview of (re)design strategies that may improve SC robustness. Also, we found that it is useful to be able to discriminate between varying degrees of SC vulnerability and to find a measure that quantifies the extent to which a company or SC shows robust performances when exposed to disturbances.
To address RQ2, we define SC robustness as the degree to which a SC shows an acceptable performance in (each of) its Key Performance Indicators (KPIs) during and after an unexpected event that caused a disturbance in one or more logistics processes. Based on the SCM literature we identified the main elements needed to achieve robust performances and structured them together to form a conceptual framework for the design of robust SCs. We then explained the logic of the framework and elaborated on each of its main elements: the SC scenario, SC disturbances, SC performance, sources of food SC vulnerability, and redesign principles and strategies.
Based on three case studies, we answered RQ3. Our major findings show that the contextual factors have a consistent relationship to Disturbance Management Principles (DMPs). The product and SC environment characteristics are contextual factors that are hard to change and these characteristics initiate the use of specific DMPs as well as constrain the use of potential response actions. The process and the SC network characteristics are contextual factors that are easier to change, and they are affected by the use of the DMPs. We also found a notable relationship between the type of DMP likely to be used and the particular combination of contextual factors present in the observed SC.
To address RQ4, we presented a new method for vulnerability assessments, the VULA method. The VULA method helps to identify how much a company is underperforming on a specific Key Performance Indicator (KPI) in the case of a disturbance, how often this would happen and how long it would last. It ultimately informs the decision maker about whether process redesign is needed and what kind of redesign strategies should be used in order to increase the SC’s robustness. The VULA method is demonstrated in the context of a meat SC using discrete-event simulation. The case findings show that performance robustness can be assessed for any KPI using the VULA method.
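The questions the VULA method answers for a given KPI (how much a company underperforms, how often, and for how long) can be sketched on a simulated KPI time series. The metric definitions, function names and example numbers below are illustrative assumptions, not the thesis's actual formulas.

```python
# Hypothetical sketch of VULA-style vulnerability metrics: given a
# simulated KPI time series and an acceptable-performance norm, measure
# how deep, how often, and how long the KPI falls below the norm.

def vulnerability_metrics(kpi_series: list[float], norm: float) -> dict:
    below = [v < norm for v in kpi_series]
    # Magnitude: worst shortfall relative to the norm.
    magnitude = max(0.0, norm - min(kpi_series))
    # Frequency: number of separate episodes below the norm.
    episodes = sum(1 for i, b in enumerate(below)
                   if b and (i == 0 or not below[i - 1]))
    # Duration: longest run of consecutive periods below the norm.
    longest = run = 0
    for b in below:
        run = run + 1 if b else 0
        longest = max(longest, run)
    return {"magnitude": magnitude, "episodes": episodes,
            "longest_duration": longest}

# Example: daily service level (%) around a 95% norm after a disturbance.
series = [97, 96, 93, 91, 94, 96, 92, 95, 97]
print(vulnerability_metrics(series, norm=95))
# -> {'magnitude': 4, 'episodes': 2, 'longest_duration': 3}
```

In a discrete-event simulation these three numbers, tracked per KPI and per disturbance scenario, are what would inform the redesign decision.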
To sum up the project, all findings were incorporated within an integrated framework for designing robust SCs. The integrated framework consists of the following steps: 1) Description of the SC scenario and identification of its specific contextual factors; 2) Identification of disturbances that may affect KPIs; 3) Definition of the relevant KPIs and identification of the main disturbances through assessment of the SC performance robustness (i.e. application of the VULA method); 4) Identification of the sources of vulnerability that may (strongly) affect the robustness of performances and eventually increase the vulnerability of the SC; 5) Identification of appropriate redesign strategies that either prevent disturbances or reduce their impact; 6) Alteration of SC scenario elements as required by the selected redesign strategies, and repetition of the VULA method for the KPIs defined in Step 3.
Contributions of this research are listed as follows. First, we have identified emerging research areas: SC robustness and its counterpart, vulnerability. Second, we have developed a definition of SC robustness, operationalized it, and identified and structured the relevant elements for the design of robust SCs in the form of a research framework. With this research framework, we contribute to a better understanding of the concepts of vulnerability and robustness and related issues in food SCs. Third, we identified the relationship between contextual factors of food SCs and specific DMPs used to maintain robust SC performances: characteristics of the product and the SC environment influence the selection and use of DMPs; processes and SC networks are influenced by DMPs. Fourth, we developed specific metrics for vulnerability assessments, which serve as the basis of the VULA method. The VULA method investigates different measures of the variability of both the duration of impacts from disturbances and the fluctuations in their magnitude.
With this project, we also hope to have delivered practical insights into food SC vulnerability. First, the integrated framework for the design of robust SCs can be used to guide food companies in successful disturbance management. Second, empirical findings from case studies lead to the identification of changeable characteristics of SCs that can serve as a basis for assessing where to focus efforts to manage disturbances. Third, the VULA method can help top management to get more reliable information about the “health” of the company.
The two most important research opportunities are: First, there is a need to extend and validate our findings related to the research framework and contextual factors through further case studies related to other types of (food) products and other types of SCs. Second, there is a need to further develop and test the VULA method, e.g.: to use other indicators and statistical measures for disturbance detection and SC improvement; to define the most appropriate KPI to represent the robustness of a complete SC. We hope this thesis invites other researchers to pick up these challenges and help us further improve the robustness of (food) SCs.


Looking at one site, the Israeli checkpoints in the occupied Palestinian territory, this article seeks to understand the mechanisms by which violence can present itself as justifiable (or justified), even when it materializes within frames presumably set to annul it. We look at the checkpoints as a condensed microcosm operating within two such frames. One is the prolonged Israeli-Palestinian ‘peace process’ (the checkpoints became a primary technology of control in the period following the beginning of the peace process), and the other is regulatory power (disciplinary and biopower), which in the Foucauldian framework presumably sidelines the violent form which sovereign power takes. We argue that the checkpoints, which dissect the Palestinian occupied territories into dozens of enclaves and which are one of the most effective and destructive means of control within the current stage of occupation, can be seen as more than obstacles in the way of Palestinian movement; we suggest that they also function as corrective technologies that are meant to fail. It is with this failure that violence can appear as justified. In order to show the operation of this embedded failure, we examine one mechanism operating within the checkpoints: ‘the imaginary line’. The imaginary line is both a component within, and an emblem of, a mode of control that constantly undoes itself in order to summon violence. Since it is never visibly marked in the physical space, the imaginary line is bound to be unintentionally crossed, thereby randomly rendering Palestinians as ‘transgressors’ of the rule and thus facilitating eruptions of violence by the soldiers stationed at the checkpoints. This article proposes an analysis of this hidden demarcation of space in order to question the different relations between subjects and power which it both assumes and constitutes.


Queer politics and spaces have historically been associated with ideals of sexual liberation. They are conceptualised as spaces where sex, and its intersections with intimacy, friendship and love, can be explored outside of normative frameworks which value monogamous reproductive heterosexuality at the expense of other non-normative sexual expressions. In recent years, however, autonomous queer spaces such as the global Queeruption gatherings and other queer community spaces in Australia have become increasingly concerned with the presence and danger of sexual violence in queer communities. Almost without exception, this danger has been responded to through the creation of ‘safe(r) spaces’ policies, generally consisting of a set of guidelines and proscribed behaviours which individuals must agree to in order to participate in or attend the event or space. The guidelines themselves tend to privilege a sexual politics of affirmative verbal consent, insisting that such consent should be sought prior to any physical or sexual contact and implying that a failure to do so is ethically unacceptable within these spaces. This chapter reflects on the attempts to construct queer communities as ‘safer spaces,’ arguing that the concepts of consent and safety are inadequate to develop a queer response to sexual violence. Such a response, it argues, must be based on the openness to possibilities and refusal of sexual restrictions and regulations that have always been central elements of queer theory and politics.


Various scientific studies have explored the causes of violent behaviour from different perspectives, with psychological tests, in particular, applied to the analysis of crime factors. The relationship between bi-factors has also been extensively studied including the link between age and crime. In reality, many factors interact to contribute to criminal behaviour and as such there is a need to have a greater level of insight into its complex nature. In this article we analyse violent crime information systems containing data on psychological, environmental and genetic factors. Our approach combines elements of rough set theory with fuzzy logic and particle swarm optimisation to yield an algorithm and methodology that can effectively extract multi-knowledge from information systems. The experimental results show that our approach outperforms alternative genetic algorithm and dynamic reduct-based techniques for reduct identification and has the added advantage of identifying multiple reducts and hence multi-knowledge (rules). Identified rules are consistent with classical statistical analysis of violent crime data and also reveal new insights into the interaction between several factors. As such, the results are helpful in improving our understanding of the factors contributing to violent crime and in highlighting the existence of hidden and intangible relationships between crime factors.
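The rough-set side of the approach above rests on the idea of a reduct: a minimal attribute subset whose indiscernibility classes still determine the decision. The following sketch finds all minimal reducts of a toy table by exhaustive search; the data and the brute-force search are hypothetical stand-ins for the article's fuzzy-logic and particle-swarm machinery, but they show why multiple reducts yield multi-knowledge (multiple rule sets).

```python
# Illustrative rough-set sketch (toy data, not the article's algorithm):
# an attribute subset "preserves" the decision if objects indiscernible
# on that subset always share the same decision value.

from itertools import combinations

def preserves_decision(rows, attrs):
    seen = {}
    for row in rows:
        key = tuple(row[a] for a in attrs)
        if seen.setdefault(key, row["decision"]) != row["decision"]:
            return False
    return True

def minimal_reducts(rows, condition_attrs):
    """Return all minimal attribute subsets that preserve the decision."""
    reducts = []
    for r in range(1, len(condition_attrs) + 1):
        for subset in combinations(condition_attrs, r):
            if preserves_decision(rows, subset) and \
               not any(set(x) <= set(subset) for x in reducts):
                reducts.append(subset)
    return reducts

# Toy table with hypothetical psychological/environmental factors
# and a binary decision attribute.
table = [
    {"impulsivity": "high", "support": "low",  "age": "young", "decision": 1},
    {"impulsivity": "high", "support": "high", "age": "young", "decision": 0},
    {"impulsivity": "low",  "support": "low",  "age": "adult", "decision": 0},
    {"impulsivity": "low",  "support": "high", "age": "adult", "decision": 0},
]
print(minimal_reducts(table, ["impulsivity", "support", "age"]))
# -> [('impulsivity', 'support'), ('support', 'age')]
```

Two distinct reducts mean two independent rule sets describing the same decisions; exhaustive search is exponential, which is why the article turns to particle swarm optimisation for realistic tables.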


This paper offers a critical reflection upon the use of a grounded theory approach within a doctoral study. As well as providing an outline of grounded theory, it begins by noting the existence of some powerful critiques of a grounded theory approach, in particular around the key concepts of ‘theory’, ‘discovery’ and ‘ground’. It is argued that, in some cases, grounded theory struggles to counter these challenges, especially in its ‘purist’ forms. However, with reference to research carried out as part of a PhD study of sharing education in Northern Ireland which employed a grounded theory approach, a case is made for an open and critical grounded theory based upon three principles: pragmatism; research as practice; and reflexivity. It is concluded that a reasonable case can be made for grounded theory where: grounded theory researchers maintain a balance between belonging to and critique of the grounded theory community; where there is an emphasis upon theorizing rather than the discovery of theory; and where the strengths of grounded theory as 'practice' and 'craft' are maximised.


Cells experience damage from exogenous and endogenous sources that endanger genome stability. Several cellular pathways have evolved to detect DNA damage and mediate its repair. Although many proteins have been implicated in these processes, only recent studies have revealed how they operate in the context of high-ordered chromatin structure. Here, we identify the nuclear oncogene SET (I2PP2A) as a modulator of DNA damage response (DDR) and repair in chromatin surrounding double-strand breaks (DSBs). We demonstrate that depletion of SET increases DDR and survival in the presence of radiomimetic drugs, while overexpression of SET impairs DDR and homologous recombination (HR)-mediated DNA repair. SET interacts with the Kruppel-associated box (KRAB)-associated co-repressor KAP1, and its overexpression results in the sustained retention of KAP1 and Heterochromatin protein 1 (HP1) on chromatin. Our results are consistent with a model in which SET-mediated chromatin compaction triggers an inhibition of DNA end resection and HR.


Possibilistic answer set programming (PASP) unites answer set programming (ASP) and possibilistic logic (PL) by associating certainty values with rules. The resulting framework combines non-monotonic reasoning and reasoning under uncertainty in a single formalism. While PASP has been well studied for possibilistic definite and possibilistic normal programs, we argue that the current semantics of possibilistic disjunctive programs are not entirely satisfactory. The problem is twofold. First, the treatment of negation-as-failure in existing approaches follows an all-or-nothing scheme that is hard to match with the graded notion of proof underlying PASP. Second, we advocate that the notion of disjunction can be interpreted in several ways. In particular, in addition to the view of ordinary ASP where disjunctions are used to induce a non-deterministic choice, the possibilistic setting naturally leads to a more epistemic view of disjunction. In this paper, we propose a semantics for possibilistic disjunctive programs, discussing both views on disjunction. Extending our earlier work, we interpret such programs as sets of constraints on possibility distributions, whose least specific solutions correspond to answer sets.
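For the possibilistic definite programs that the paper builds on, the graded notion of proof can be sketched as a fixpoint computation: an atom's certainty is the best (max) over its rules of the weakest link (min) among the rule's certainty and its body atoms' certainties. The program and names below are illustrative assumptions; the paper's disjunctive semantics is considerably richer than this sketch.

```python
# Sketch of the fixpoint for a possibilistic *definite* program.
# A rule is (head, body_atoms, certainty); facts have an empty body.
# certainty(head) = max over rules of min(rule certainty, body certainties).

def possibilistic_fixpoint(rules):
    val = {}  # atom -> certainty in [0, 1]
    changed = True
    while changed:
        changed = False
        for head, body, cert in rules:
            if all(a in val for a in body):
                derived = min([cert] + [val[a] for a in body])
                if derived > val.get(head, 0.0):
                    val[head] = derived
                    changed = True
    return val

# Hypothetical program: facts and rules with certainty degrees.
program = [
    ("rain", [], 0.8),
    ("sprinkler", [], 0.3),
    ("wet", ["rain"], 0.9),       # wet <- rain, certainty 0.9
    ("wet", ["sprinkler"], 1.0),  # wet <- sprinkler, certainty 1.0
    ("slippery", ["wet"], 0.6),
]
print(possibilistic_fixpoint(program))
# -> {'rain': 0.8, 'sprinkler': 0.3, 'wet': 0.8, 'slippery': 0.6}
```

The all-or-nothing problem the paper raises appears as soon as negation-as-failure is added: whether "not p" succeeds cannot be graded this simply, which motivates the constraint-based semantics over possibility distributions.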


Seafloor massive sulfide (SMS) mining will likely occur at hydrothermal systems in the near future. Alongside their mineral wealth, SMS deposits also have considerable biological value. Active SMS deposits host endemic hydrothermal vent communities, whilst inactive deposits support communities of deep water corals and other suspension feeders. Mining activities are expected to remove all large organisms and suitable habitat in the immediate area, leaving vent-endemic organisms particularly at risk from habitat loss and localised extinction. As part of environmental management strategies designed to mitigate the effects of mining, areas of seabed need to be protected to preserve biodiversity that is lost at the mine site and to preserve communities that support connectivity among populations of vent animals in the surrounding region. These "set-aside" areas need to be biologically similar to the mine site and be suitably connected, mostly by transport of larvae, to neighbouring sites to ensure exchange of genetic material among remaining populations. Establishing suitable set-asides can be a formidable task for environmental managers; however, the application of genetic approaches can aid set-aside identification, suitability assessment and monitoring. There are many genetic tools available, including analysis of mitochondrial DNA (mtDNA) sequences (e.g. COI or other suitable mtDNA genes) and appropriate nuclear DNA markers (e.g. microsatellites, single nucleotide polymorphisms), environmental DNA (eDNA) techniques and microbial metagenomics. When used in concert with traditional biological survey techniques, these tools can help to identify species, assess the genetic connectivity among populations and assess the diversity of communities. How these techniques can be applied to set-aside decision making is discussed and recommendations are made for the genetic characteristics of set-aside sites.
A checklist for environmental regulators forms a guide to aid decision making on the suitability of set-aside design and assessment using genetic tools. This non-technical primer document represents the views of participants in the VentBase 2014 workshop.


The viscosity of ionic liquids (ILs) has been modeled as a function of temperature at atmospheric pressure using a new method based on the UNIFAC–VISCO method. This model extends the calculations previously reported by our group (see Zhao et al. J. Chem. Eng. Data 2016, 61, 2160–2169), which used 154 experimental viscosity data points of 25 ionic liquids for regression of a set of binary interaction parameters and ion Vogel–Fulcher–Tammann (VFT) parameters. Discrepancies in the experimental data for the same IL affect the quality of the correlation and thus the development of the predictive method. In this work, mathematical gnostics was used to analyze the experimental data from different sources and recommend one set of reliable data for each IL. These recommended data (819 data points in total) for 70 ILs were correlated using this model to obtain an extended set of binary interaction parameters and ion VFT parameters, with a regression accuracy of 1.4%. In addition, 966 experimental viscosity data points for 11 binary mixtures of ILs were collected from the literature to extend the model to mixtures. The binary data comprise 128 training data points used for the optimization of binary interaction parameters and 838 test data points used to assess the predicted values. The relative average absolute deviation (RAAD) is 2.9% for training and 3.9% for testing.
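The VFT temperature dependence and the RAAD statistic named above can be written down directly. This is a minimal sketch of both: the VFT parameter values and the "experimental" points are illustrative assumptions, not the paper's fitted ion parameters or data.

```python
import math

# Generic Vogel-Fulcher-Tammann (VFT) viscosity model and the RAAD
# statistic; parameter values below are illustrative only.

def vft_viscosity(T, A, B, T0):
    """VFT model: ln(eta) = A + B / (T - T0), eta in mPa·s, T in K."""
    return math.exp(A + B / (T - T0))

def raad(calculated, experimental):
    """Relative average absolute deviation between two data sets."""
    return sum(abs(c - e) / e for c, e in zip(calculated, experimental)) \
        / len(experimental)

# Hypothetical IL: evaluate the model over a temperature range and
# compare against made-up "experimental" points.
temps = [293.15, 313.15, 333.15]
calc = [vft_viscosity(T, A=-3.0, B=900.0, T0=165.0) for T in temps]
exper = [55.0, 23.0, 12.0]
print([round(v, 1) for v in calc], f"RAAD = {raad(calc, exper):.1%}")
```

In the paper's setup the VFT parameters come per ion and the UNIFAC–VISCO binary interaction parameters handle mixing; here a single parameter set stands in for the whole model.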