• List of Articles


      • Open Access Article

        1 - Privacy Preserving Big Data Mining: Association Rule Hiding
        Golnar Assadat Afzali, Shahriyar Mohammadi
        Data repositories contain sensitive information which must be protected from unauthorized access. Existing data mining techniques can be considered a privacy threat to sensitive data. Association rule mining is one of the most widely used data mining techniques; it tries to discover relationships between seemingly unrelated data in a database. Association rule hiding is a research area in privacy-preserving data mining (PPDM) which addresses the problem of hiding sensitive rules within the data. Much research has been done in this area, but most of it focuses on reducing the undesired side effects of deleting sensitive association rules in static databases. In the age of big data, however, we are confronted with dynamic databases to which new data may be added at any time, so most existing techniques are impractical and must be updated to suit such huge databases. In this paper, a data anonymization technique is used for association rule hiding, and parallelization and scalability features are embedded in the proposed model in order to speed up the big data mining process. In this way, instead of removing some instances of an existing important association rule, generalization is used to anonymize items at an appropriate level, so that important association rules can be updated, if necessary, as new data arrive. We conducted experiments on three datasets to evaluate the performance of the proposed model in comparison with Max-Min2 and HSCRIL. Experimental results show that the information loss of the proposed model is lower than that of existing approaches and that the model can be executed in parallel for shorter execution times.
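The generalization idea described in this abstract can be sketched as follows; the taxonomy, item names, and `support` helper below are invented illustrations, not the paper's actual datasets or code:

```python
# Hypothetical sketch of generalization-based rule hiding: instead of
# deleting transactions that support a sensitive rule, the offending
# items are replaced by their parent category in a taxonomy, so the
# item-level rule disappears while category-level associations survive.

TAXONOMY = {"white_bread": "bread", "wheat_bread": "bread",
            "skim_milk": "milk", "whole_milk": "milk"}

def generalize(transactions, sensitive_items):
    """Replace each sensitive item with its parent category."""
    return [{TAXONOMY.get(i, i) if i in sensitive_items else i for i in t}
            for t in transactions]

def support(transactions, itemset):
    """Fraction of transactions containing every item in `itemset`."""
    return sum(1 for t in transactions if itemset <= t) / len(transactions)

db = [{"white_bread", "skim_milk"}, {"white_bread", "skim_milk"}, {"wheat_bread"}]
hidden = generalize(db, {"white_bread", "skim_milk"})
# The sensitive item-level rule loses all support...
assert support(hidden, {"white_bread", "skim_milk"}) == 0.0
# ...but the generalized category-level association is preserved.
assert support(hidden, {"bread", "milk"}) == 2 / 3
```

Because the generalized items are retained rather than deleted, the category-level rule can still be re-mined as new transactions arrive, which is the updatability the abstract emphasizes.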
      • Open Access Article

        2 - COGNISON: A Novel Dynamic Community Detection Algorithm in Social Network
        Hamideh Sadat Cheraghchi, Ali Zakerolhossieni
        The problem of community detection has a long tradition in the data mining area and has many challenging facets, especially when it comes to community detection in a time-varying context. While recent studies argue for the usability of social science disciplines in modern social network analysis, we present a novel dynamic community detection algorithm called COGNISON, inspired mainly by social theories. Specifically, we take inspiration from prototype theory and cognitive consistency theory to recognize the best community for each member, formulating the community detection algorithm by analogy with human reasoning. COGNISON belongs to the category of representative-based algorithms and hints at fortifying the purely mathematical approach to community detection with established social science disciplines. The proposed model is able to determine the proper number of communities with high accuracy in both weighted and binary networks. Comparison with state-of-the-art algorithms proposed for dynamic community discovery on real datasets shows the higher performance of this method on different measures of accuracy, NMI, and entropy for detecting communities over time. Finally, our approach motivates the application of human-inspired models in the dynamic community detection context and suggests the fruitfulness of connecting the community detection field and social science theories.
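As a loose illustration of the representative-based category this algorithm belongs to, the toy step below assigns each member to its most similar prototype; the similarity measure and graph are invented and do not reproduce COGNISON's actual theory-driven rules:

```python
# Generic sketch of one representative-based assignment step. The real
# COGNISON update is derived from prototype theory and cognitive
# consistency theory; everything below is illustrative only.

def similarity(node, prototype, adjacency):
    """Toy similarity: number of neighbours shared with the prototype."""
    return len(adjacency[node] & adjacency[prototype])

def assign_to_prototypes(nodes, prototypes, adjacency):
    """Each member joins the community of its most similar prototype."""
    return {n: max(prototypes, key=lambda p: similarity(n, p, adjacency))
            for n in nodes}

adjacency = {
    0: {1, 2}, 1: {0, 2}, 2: {0, 1},   # first triangle community
    3: {4, 5}, 4: {3, 5}, 5: {3, 4},   # second triangle community
}
labels = assign_to_prototypes(adjacency, [0, 3], adjacency)
# Members of each triangle end up with that triangle's prototype.
assert labels[1] == 0 and labels[4] == 3
```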
      • Open Access Article

        3 - Analysis and Evaluation of Techniques for Myocardial Infarction Based on Genetic Algorithm and Weight by SVM
        Hojatallah Hamidi, Atefeh Daraei
        Although the death rate from Myocardial Infarction is decreasing in developed countries, it has become the leading cause of death in developing countries. Data mining approaches can be utilized to predict the occurrence of Myocardial Infarction. Because of the side effects of using angioplasty as the main method for diagnosing Myocardial Infarction, presenting a method for diagnosing MI before its occurrence seems really important. This study aims to investigate prediction models for Myocardial Infarction by applying a feature selection model based on Weight by SVM and a genetic algorithm. In the proposed method, a hybrid feature selection method is applied to improve the performance of the classification algorithms. In the first stage of this method, features are selected based on their weights, using Weight by Support Vector Machine. In the second stage, the selected features are given to a genetic algorithm for final selection. After selecting appropriate features, eight classification methods, including Sequential Minimal Optimization, REPTree, Multi-layer Perceptron, Random Forest, K-Nearest Neighbors and Bayesian Network, are applied to predict the occurrence of Myocardial Infarction. Finally, the best accuracy among the applied classification algorithms was achieved by Multi-layer Perceptron and Sequential Minimal Optimization.
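The two-stage selection described above can be sketched roughly as follows; the feature names, weights, and fitness function are invented stand-ins (real weights would come from a trained linear SVM, and real fitness from classifier accuracy on the selected subset):

```python
import random

random.seed(0)

# Stage 1 (illustrative): keep the features with the largest absolute
# SVM weights. These weights are made-up numbers standing in for the
# coefficients of a trained linear SVM.
weights = {"age": 0.9, "bp": 0.7, "chol": 0.05, "bmi": 0.6, "hr": 0.02, "ecg": 0.8}
stage1 = sorted(weights, key=lambda f: abs(weights[f]), reverse=True)[:4]

# Stage 2 (illustrative): a tiny genetic algorithm searches subsets of
# the surviving features; the fitness is a toy proxy for accuracy.
def fitness(mask):
    chosen = [f for f, bit in zip(stage1, mask) if bit]
    return sum(weights[f] for f in chosen) - 0.2 * len(chosen)

def ga(n_bits, generations=30, pop_size=8):
    pop = [[random.randint(0, 1) for _ in range(n_bits)] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        parents = pop[: pop_size // 2]          # keep the fitter half
        children = []
        for _ in range(pop_size - len(parents)):
            a, b = random.sample(parents, 2)
            cut = random.randrange(1, n_bits)   # one-point crossover
            child = a[:cut] + b[cut:]
            if random.random() < 0.1:           # occasional bit-flip mutation
                child[random.randrange(n_bits)] ^= 1
            children.append(child)
        pop = parents + children
    return max(pop, key=fitness)

best = ga(len(stage1))
selected = [f for f, bit in zip(stage1, best) if bit]
```

The selected subset would then be handed to the eight classifiers listed in the abstract.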
      • Open Access Article

        4 - Optimization of Random Phase Updating Technique for Effective Reduction in PAPR, Using Discrete Cosine Transform
        Babak Haji Bagher Naeeni
        One of the problems of OFDM systems is the large peak-to-average power ratio (PAPR). Many attempts have been made to reduce it, among which random phase updating is an important technique. In this method, since the power variance is computable before the IFFT block, the complexity is lower than that of other phase injection methods, which can be an important factor. Another interesting capability of the random phase updating technique is the possibility of applying a threshold on the power variance: the phase injection operation is repeated until the power variance reaches the threshold. However, this may also be considered a disadvantage of the random phase updating technique, because reaching the mentioned threshold may introduce system delay. In this paper, in order to solve this problem, the DCT transform is applied to the subcarrier outputs before phase injection. This reduces the number of carriers required to reach the threshold value, which in turn reduces system delay.
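The stopping rule described above, repeating phase injection until the power variance drops below a threshold, can be sketched as follows; for clarity this toy version measures the variance of the time-domain powers after an explicit IDFT (the paper computes it before the IFFT block), and the frame, threshold, and iteration cap are all illustrative:

```python
import cmath
import random

random.seed(1)
N = 16
symbols = [1 + 0j] * N  # toy frame: identical carriers give a worst-case peak

def idft(freq):
    """Naive inverse DFT, standing in for the IFFT block."""
    n = len(freq)
    return [sum(freq[k] * cmath.exp(2j * cmath.pi * k * t / n) for k in range(n)) / n
            for t in range(n)]

def power_variance(freq):
    """Variance of the instantaneous powers of the time-domain signal."""
    powers = [abs(s) ** 2 for s in idft(freq)]
    mean = sum(powers) / len(powers)
    return sum((p - mean) ** 2 for p in powers) / len(powers)

def random_phase_update(freq, threshold, max_iters=200):
    """Inject random carrier phases until the power variance falls to
    the threshold -- the stopping rule described in the abstract."""
    best, iters = list(freq), 0
    while power_variance(best) > threshold and iters < max_iters:
        best = [s * cmath.exp(1j * random.uniform(0, 2 * cmath.pi)) for s in best]
        iters += 1
    return best, iters
```

The number of iterations spent in this loop is exactly the delay the paper attacks: applying a DCT to the subcarrier outputs before the phase injection makes the threshold reachable sooner.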
      • Open Access Article

        5 - Nonlinear State Estimation Using Hybrid Robust Cubature Kalman Filter
        Behrooz Safarinejadian, Mohsen Taher
        In this paper, a novel filter is presented that estimates the states of any nonlinear system with high accuracy, both in the presence and absence of uncertainty. It is well understood that robust filter design is a compromise between robustness and estimation accuracy. In fact, a robust filter is designed to obtain accurate and suitable performance in the presence of modelling errors, so in the absence of unknown or time-varying uncertainties it does not provide the desired performance. The new method presented in this paper, named the hybrid robust cubature Kalman filter (CKF), is constructed by combining a traditional CKF and a novel robust CKF. The novel robust CKF is designed by merging a traditional CKF with an uncertainty estimator so that it can provide the desired performance in the presence of uncertainty. Since the presence of uncertainty results in a large innovation value, the hybrid robust CKF adapts itself according to the value of the normalized innovation. The CKF and robust CKF are run in parallel, and at each time step a decision is made to choose the estimated state of either the CKF or the robust CKF as the final state estimate. To validate the performance of the proposed filters, two examples are given that demonstrate their promising performance.
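The parallel-filter switching idea can be sketched in scalar form as follows; the "robust" filter here simply assumes inflated measurement noise as a stand-in for the paper's uncertainty-estimating robust CKF, and the gate value and noise levels are illustrative:

```python
# Scalar sketch of innovation-gated switching between a nominal and a
# robust filter. `r` is the measurement-noise variance assumed by each
# filter; the robust copy uses a larger r as a crude proxy for the
# paper's uncertainty estimator.
def kf_step(x, p, z, q, r):
    p = p + q                       # predict (identity dynamics assumed)
    s = p + r                       # innovation variance
    k = p / s                       # Kalman gain
    innov = z - x
    x = x + k * innov
    p = (1 - k) * p
    return x, p, innov / s ** 0.5   # normalized innovation

def hybrid_step(state, z, gate=2.0):
    """Run both filters in parallel; a large normalized innovation on
    the nominal filter signals uncertainty, so the robust estimate is
    chosen as the output for that step (the switching idea above)."""
    xn, pn, nin = kf_step(state["xn"], state["pn"], z, q=0.01, r=0.1)
    xr, pr, _ = kf_step(state["xr"], state["pr"], z, q=0.01, r=1.0)
    state.update(xn=xn, pn=pn, xr=xr, pr=pr)
    return xr if abs(nin) > gate else xn

state = {"xn": 0.0, "pn": 1.0, "xr": 0.0, "pr": 1.0}
# The outlier-like measurement 3.0 would trip the gate at that step.
estimates = [hybrid_step(state, z) for z in [0.1, 0.05, 3.0, 0.12, 0.08]]
```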
      • Open Access Article

        6 - Quality Assessment Based Coded Apertures for Defocus Deblurring
        Mina Masoudifar, Hamid Reza Pourreza
        A conventional camera with small pixels may capture images with defocused, blurred regions. Blurring, acting as a low-pass filter, attenuates or drops details of the captured image, which makes deblurring an ill-posed problem. Coded aperture photography can decrease the destructive effects of blurring in defocused images; hence, aperture patterns are designed or evaluated based on how well they reduce these effects. In this paper, a new function is presented for evaluating aperture patterns designed for defocus deblurring. The proposed function consists of a weighted sum of two new criteria, which are defined based on the spectral characteristics of an aperture pattern. On the basis of these criteria, a pattern whose spectral properties are more similar to a flat all-pass filter is assessed as a better pattern. The weights of the criteria are determined by a learning approach, using an aggregate image quality assessment measure that combines an existing perceptual metric and an objective metric. Based on the proposed evaluation function, a genetic algorithm that converges to a near-optimal binary aperture pattern is developed. In consequence, an asymmetric and a semi-symmetric pattern are proposed. The resulting patterns are compared with the circular aperture and some other patterns in different scenarios.
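A rough 1-D analogue of scoring a pattern by the flatness of its spectrum might look like this; the two criteria and the fixed weights below are invented stand-ins for the paper's learned criteria and weights:

```python
import cmath

def dft_mag(pattern):
    """Magnitude spectrum of a 1-D binary aperture pattern."""
    n = len(pattern)
    return [abs(sum(pattern[t] * cmath.exp(-2j * cmath.pi * k * t / n)
                    for t in range(n)))
            for k in range(n)]

def flatness_score(pattern, w1=0.5, w2=0.5):
    """Illustrative weighted sum of two spectral criteria: reward a
    large worst-case frequency response (no spectral zeros) and
    penalize spread around the mean response. The paper's actual
    criteria and learned weights differ."""
    mags = dft_mag(pattern)
    mean = sum(mags) / len(mags)
    spread = sum((m - mean) ** 2 for m in mags) / len(mags)
    return w1 * min(mags) - w2 * spread

open_aperture = [1, 1, 1, 1, 1, 1, 1, 1]  # conventional aperture: deep spectral zeros
coded = [1, 0, 1, 1, 0, 1, 1, 0]          # a hypothetical binary coded pattern
# The coded pattern keeps every frequency alive, so it scores higher.
assert flatness_score(coded) > flatness_score(open_aperture)
```

A genetic algorithm like the one in the paper would use such a score as the fitness when searching over binary patterns.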
      • Open Access Article

        7 - Design, Implementation and Evaluation of Multi-terminal Binary Decision Diagram based Binary Fuzzy Relations
        Hamid Alavi Toussi, Bahram Sadeghi Bigham
        Elimination of redundancies in the memory representation is necessary for fast and efficient analysis of large sets of fuzzy data. In this work, we use MTBDDs as the underlying data structure to represent fuzzy sets and binary fuzzy relations. This leads to the elimination of redundancies in the representation, fewer computations, and faster analyses. We also extended a BDD package (BuDDy) to support MTBDDs in general and fuzzy sets and relations in particular. The representation and manipulation of MTBDD-based fuzzy sets and binary fuzzy relations are described in this paper, including the design and implementation of different fuzzy operations such as max, min and max-min composition. In particular, an efficient algorithm for computing max-min composition is presented. The effectiveness of our MTBDD-based implementation is shown by applying it to the fuzzy connectedness and image segmentation problem. Compared to a base implementation, the running time of the MTBDD-based implementation was faster (in our test cases) by a factor ranging from 2 to 27. Also, when the MTBDD-based data structure was employed, the memory needed to represent the final results was improved by a factor ranging from 37.9 to 265.5. We also describe our base implementation, which is based on matrices.
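For reference, the max-min composition mentioned above is, in its matrix ("base") form, simply the following; the MTBDD implementation computes the same operation while sharing sub-structure between equal terminal values:

```python
def max_min_composition(r, s):
    """Matrix ("base") implementation of fuzzy max-min composition:
    (R o S)[i][k] = max over j of min(R[i][j], S[j][k])."""
    return [[max(min(r_ij, s[j][k]) for j, r_ij in enumerate(row))
             for k in range(len(s[0]))]
            for row in r]

R = [[0.2, 0.8],
     [0.5, 0.1]]
S = [[0.6, 0.3],
     [0.9, 0.4]]
T = max_min_composition(R, S)
assert T == [[0.8, 0.4], [0.5, 0.3]]  # e.g. T[0][0] = max(min(.2,.6), min(.8,.9)) = 0.8
```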
      • Open Access Article

        8 - Unsupervised Segmentation of Retinal Blood Vessels Using the Human Visual System Line Detection Model
        Mohsen Zardadi, Nasser Mehrshad, Seyyed Mohammad Razavi
        Retinal image assessment has been employed by the medical community for diagnosing vascular and non-vascular pathology. Computer-based analysis of blood vessels in retinal images will help ophthalmologists monitor larger populations for vessel abnormalities. Automatic segmentation of blood vessels from retinal images is the initial step of computer-based assessment for blood vessel anomalies. In this paper, a fast unsupervised method for the automatic detection of blood vessels in retinal images is presented. In order to eliminate the optic disc and background noise in the fundus images, a simple preprocessing technique is introduced. First, a newly devised method based on a simple-cell model of the human visual system (HVS) enhances the blood vessels in various directions. Then, an activity function is defined on the simple-cell responses. Next, an adaptive threshold is used as an unsupervised classifier that classifies each pixel as a vessel or non-vessel pixel to obtain a binary vessel image. Lastly, morphological post-processing is applied to eliminate exudates that are detected as blood vessels. The method was tested on two publicly available databases, DRIVE and STARE, which are frequently used for this purpose. The results demonstrate that the performance of the proposed algorithm is comparable with state-of-the-art techniques.
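The adaptive-threshold classification step can be sketched as follows; the toy response map and the mean-plus-k-std rule are illustrative stand-ins, not the paper's actual HVS-based responses or threshold:

```python
def adaptive_threshold(responses, k=1.0):
    """Unsupervised pixel classification: pixels whose line-detector
    response exceeds mean + k*std are labelled vessel (1), the rest
    non-vessel (0). The real response model and rule are in the paper."""
    flat = [v for row in responses for v in row]
    mean = sum(flat) / len(flat)
    std = (sum((v - mean) ** 2 for v in flat) / len(flat)) ** 0.5
    cut = mean + k * std
    return [[1 if v > cut else 0 for v in row] for row in responses]

# Toy 3x3 response map: the bright diagonal stands in for a vessel.
responses = [[0.9, 0.1, 0.1],
             [0.1, 0.8, 0.1],
             [0.1, 0.1, 0.9]]
mask = adaptive_threshold(responses)
assert mask == [[1, 0, 0], [0, 1, 0], [0, 0, 1]]
```

The resulting binary image would then go through the morphological post-processing the abstract describes.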