Fuzzy Logic and Fuzzy Set Theory Based Edge Detection Algorithm
Abstract
In this paper we present a method for detecting edges in digital images. Edge detection is a fundamental part of many algorithms in both image processing and video processing, so it is important that the algorithm be efficient and, if possible, fast to execute. The fuzzy set theory based approach to edge detection is well suited to tasks that require image segmentation or edge classification (primary, secondary, ...). One example that motivated us is region labeling, a process in which a digital image is divided into units and each unit is given a unique label (sky, house, grass, etc.). To accomplish that, we need an intelligent system that can precisely determine the edges of each region. In this paper we describe the tools from image processing and fuzzy logic that we use for edge detection, as well as the proposed algorithm.