  1. <?xml version="1.0"?>
  2. <doc>
  3. <assembly>
  4. <name>AForge.Vision</name>
  5. </assembly>
  6. <members>
  7. <member name="T:AForge.Vision.Motion.MotionAreaHighlighting">
  8. <summary>
  9. Motion processing algorithm, which highlights motion areas.
  10. </summary>
  11. <remarks><para>The aim of this motion processing algorithm is to highlight
  12. motion areas with a grid pattern of the <see cref="P:AForge.Vision.Motion.MotionAreaHighlighting.HighlightColor">specified color</see>.
  13. </para>
  14. <para>Sample usage:</para>
  15. <code>
  16. // create motion detector
  17. MotionDetector detector = new MotionDetector(
  18. /* motion detection algorithm */,
  19. new MotionAreaHighlighting( ) );
  20. // continuously feed video frames to motion detector
  21. while ( ... )
  22. {
  23. // process new video frame
  24. detector.ProcessFrame( videoFrame );
  25. }
  26. </code>
  27. </remarks>
  28. <seealso cref="T:AForge.Vision.Motion.MotionDetector"/>
  29. <seealso cref="T:AForge.Vision.Motion.IMotionDetector"/>
  30. </member>
  31. <member name="T:AForge.Vision.Motion.IMotionProcessing">
  32. <summary>
  33. Interface of motion processing algorithm.
  34. </summary>
  35. <remarks><para>The interface specifies methods, which should be implemented
  36. by all motion processing algorithms - algorithms which perform further post-processing
  37. of the motion detected by motion detection algorithms (see <see cref="T:AForge.Vision.Motion.IMotionDetector"/>).
  38. </para>
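<para>For example, a very small custom processing step could be implemented as follows (a minimal
sketch; the <b>MotionPixelCounter</b> class name and its body are hypothetical):</para>
<code>
public class MotionPixelCounter : IMotionProcessing
{
    public void ProcessFrame( UnmanagedImage videoFrame, UnmanagedImage motionFrame )
    {
        // inspect the motion frame here, e.g. count its white (motion) pixels
    }

    public void Reset( )
    {
        // nothing to do - this sketch keeps no state between frames
    }
}
</code>
</remarks>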
  39. <seealso cref="T:AForge.Vision.Motion.MotionDetector"/>
  40. <seealso cref="T:AForge.Vision.Motion.IMotionDetector"/>
  41. </member>
  42. <member name="M:AForge.Vision.Motion.IMotionProcessing.ProcessFrame(AForge.Imaging.UnmanagedImage,AForge.Imaging.UnmanagedImage)">
  43. <summary>
  44. Process video and motion frames doing further post processing after
  45. performed motion detection.
  46. </summary>
  47. <param name="videoFrame">Original video frame.</param>
  48. <param name="motionFrame">Motion frame provided by motion detection
  49. algorithm (see <see cref="T:AForge.Vision.Motion.IMotionDetector"/>).</param>
  50. <remarks><para>The method does further post-processing of detected motion.
  51. The type of motion post-processing is specified by the specific implementation
  52. of the <see cref="T:AForge.Vision.Motion.IMotionProcessing"/> interface - it may be motion
  53. area highlighting, moving objects counting, etc.</para></remarks>
  54. </member>
  55. <member name="M:AForge.Vision.Motion.IMotionProcessing.Reset">
  56. <summary>
  57. Reset internal state of motion processing algorithm.
  58. </summary>
  59. <remarks><para>The method resets the internal state of the motion processing
  60. algorithm and prepares it for processing of the next video stream or for restarting
  61. the algorithm.</para>
  62. <para><note>Some motion processing algorithms may not have any stored internal
  63. state and may just process the provided video frames without relying on any motion
  64. history, etc. In this case, such algorithms provide an empty implementation of this method.</note></para>
  65. </remarks>
  66. </member>
  67. <member name="M:AForge.Vision.Motion.MotionAreaHighlighting.#ctor">
  68. <summary>
  69. Initializes a new instance of the <see cref="T:AForge.Vision.Motion.MotionAreaHighlighting"/> class.
  70. </summary>
  71. </member>
  72. <member name="M:AForge.Vision.Motion.MotionAreaHighlighting.#ctor(System.Drawing.Color)">
  73. <summary>
  74. Initializes a new instance of the <see cref="T:AForge.Vision.Motion.MotionAreaHighlighting"/> class.
  75. </summary>
  76. <param name="highlightColor">Color used to highlight motion regions.</param>
  77. </member>
  78. <member name="M:AForge.Vision.Motion.MotionAreaHighlighting.ProcessFrame(AForge.Imaging.UnmanagedImage,AForge.Imaging.UnmanagedImage)">
  79. <summary>
  80. Process video and motion frames doing further post processing after
  81. performed motion detection.
  82. </summary>
  83. <param name="videoFrame">Original video frame.</param>
  84. <param name="motionFrame">Motion frame provided by motion detection
  85. algorithm (see <see cref="T:AForge.Vision.Motion.IMotionDetector"/>).</param>
  86. <remarks><para>Processes provided motion frame and highlights motion areas
  87. on the original video frame with <see cref="P:AForge.Vision.Motion.MotionAreaHighlighting.HighlightColor">specified color</see>.</para>
  88. </remarks>
  89. <exception cref="T:AForge.Imaging.InvalidImagePropertiesException">Motion frame is not 8 bpp image, but it must be so.</exception>
  90. <exception cref="T:AForge.Imaging.UnsupportedImageFormatException">Video frame must be 8 bpp grayscale image or 24/32 bpp color image.</exception>
  91. </member>
  92. <member name="M:AForge.Vision.Motion.MotionAreaHighlighting.Reset">
  93. <summary>
  94. Reset internal state of motion processing algorithm.
  95. </summary>
  96. <remarks><para>The method resets the internal state of the motion processing
  97. algorithm and prepares it for processing of the next video stream or for restarting
  98. the algorithm.</para></remarks>
  99. </member>
  100. <member name="P:AForge.Vision.Motion.MotionAreaHighlighting.HighlightColor">
  101. <summary>
  102. Color used to highlight motion regions.
  103. </summary>
  104. <remarks>
  105. <para>Default value is set to <b>red</b> color.</para>
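<para>For example, the highlight color could be changed as follows (a minimal sketch;
it assumes <b>System.Drawing</b> is referenced for the <b>Color</b> type):</para>
<code>
MotionAreaHighlighting highlighting = new MotionAreaHighlighting( );
// highlight motion areas in green instead of the default red
highlighting.HighlightColor = Color.Green;
</code>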
  106. </remarks>
  107. </member>
  108. <member name="T:AForge.Vision.Motion.IMotionDetector">
  109. <summary>
  110. Interface of motion detector algorithm.
  111. </summary>
  112. <remarks><para>The interface specifies methods, which should be implemented
  113. by all motion detection algorithms - algorithms which perform processing of video
  114. frames in order to detect motion. Amount of detected motion may be checked using
  115. <see cref="P:AForge.Vision.Motion.IMotionDetector.MotionLevel"/> property. Also <see cref="P:AForge.Vision.Motion.IMotionDetector.MotionFrame"/> property may
  116. be used in order to see all the detected motion areas. For example, the <see cref="P:AForge.Vision.Motion.IMotionDetector.MotionFrame"/> property
  117. is used by motion processing algorithms for further motion post processing, like
  118. highlighting motion areas, counting the number of detected moving objects, etc.
  119. </para></remarks>
  120. <seealso cref="T:AForge.Vision.Motion.MotionDetector"/>
  121. <seealso cref="T:AForge.Vision.Motion.IMotionProcessing"/>
  122. </member>
  123. <member name="M:AForge.Vision.Motion.IMotionDetector.ProcessFrame(AForge.Imaging.UnmanagedImage)">
  124. <summary>
  125. Process new video frame.
  126. </summary>
  127. <param name="videoFrame">Video frame to process (detect motion in).</param>
  128. <remarks><para>Processes new frame from video source and detects motion in it.</para></remarks>
  129. </member>
  130. <member name="M:AForge.Vision.Motion.IMotionDetector.Reset">
  131. <summary>
  132. Reset motion detector to initial state.
  133. </summary>
  134. <remarks><para>Resets internal state and variables of motion detection algorithm.
  135. Usually this is done before processing a new video source, but it
  136. may also be done at any time to restart the motion detection algorithm.</para>
  137. </remarks>
  138. </member>
  139. <member name="P:AForge.Vision.Motion.IMotionDetector.MotionLevel">
  140. <summary>
  141. Motion level value, [0, 1].
  142. </summary>
  143. <remarks><para>Amount of changes in the last processed frame. For example, if the value of
  144. this property equals 0.1, it means that the last processed frame has 10% of changes
  145. (however, it is up to the specific implementation to decide how frames are compared).</para>
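<para>For example (a minimal sketch; it assumes <b>detector</b> is an instance of a class
implementing this interface and <b>videoFrame</b> is the current frame):</para>
<code>
// process the current frame and check how much of it has changed
detector.ProcessFrame( videoFrame );
if ( detector.MotionLevel &gt; 0.02 )
{
    // more than 2% of the frame differs - treat it as motion
}
</code>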
  146. </remarks>
  147. </member>
  148. <member name="P:AForge.Vision.Motion.IMotionDetector.MotionFrame">
  149. <summary>
  150. Motion frame containing detected areas of motion.
  151. </summary>
  152. <remarks><para>Motion frame is a grayscale image, which shows areas of detected motion.
  153. Black pixels in the motion frame correspond to areas where no motion is
  154. detected, while white pixels correspond to areas where motion is detected.</para></remarks>
  155. </member>
  156. <member name="T:AForge.Vision.Motion.BlobCountingObjectsProcessing">
  157. <summary>
  158. Motion processing algorithm, which counts separate moving objects and highlights them.
  159. </summary>
  160. <remarks><para>The aim of this motion processing algorithm is to count separate objects
  161. in the motion frame, which is provided by <see cref="T:AForge.Vision.Motion.IMotionDetector">motion detection algorithm</see>.
  162. If the <see cref="P:AForge.Vision.Motion.BlobCountingObjectsProcessing.HighlightMotionRegions"/> property is set to <see langword="true"/>,
  163. found objects are also highlighted on the original video frame. The algorithm
  164. counts and highlights only those objects whose size satisfies the <see cref="P:AForge.Vision.Motion.BlobCountingObjectsProcessing.MinObjectsWidth"/>
  165. and <see cref="P:AForge.Vision.Motion.BlobCountingObjectsProcessing.MinObjectsHeight"/> properties.</para>
  166. <para><note>The motion processing algorithm is supposed to be used only with motion detection
  167. algorithms, which are based on finding the difference with a background frame
  168. (see <see cref="T:AForge.Vision.Motion.SimpleBackgroundModelingDetector"/> and <see cref="T:AForge.Vision.Motion.CustomFrameDifferenceDetector"/>
  169. as simple implementations) and allow extracting moving objects clearly.</note></para>
  170. <para>Sample usage:</para>
  171. <code>
  172. // create instance of motion detection algorithm
  173. IMotionDetector motionDetector = new ... ;
  174. // create instance of motion processing algorithm
  175. BlobCountingObjectsProcessing motionProcessing = new BlobCountingObjectsProcessing( );
  176. // create motion detector
  177. MotionDetector detector = new MotionDetector( motionDetector, motionProcessing );
  178. // continuously feed video frames to motion detector
  179. while ( ... )
  180. {
  181. // process new video frame and check motion level
  182. if ( detector.ProcessFrame( videoFrame ) &gt; 0.02 )
  183. {
  184. // check number of detected objects
  185. if ( motionProcessing.ObjectsCount &gt; 1 )
  186. {
  187. // ...
  188. }
  189. }
  190. }
  191. </code>
  192. </remarks>
  193. <seealso cref="T:AForge.Vision.Motion.MotionDetector"/>
  194. <seealso cref="T:AForge.Vision.Motion.IMotionDetector"/>
  195. </member>
  196. <member name="M:AForge.Vision.Motion.BlobCountingObjectsProcessing.#ctor">
  197. <summary>
  198. Initializes a new instance of the <see cref="T:AForge.Vision.Motion.BlobCountingObjectsProcessing"/> class.
  199. </summary>
  200. </member>
  201. <member name="M:AForge.Vision.Motion.BlobCountingObjectsProcessing.#ctor(System.Boolean)">
  202. <summary>
  203. Initializes a new instance of the <see cref="T:AForge.Vision.Motion.BlobCountingObjectsProcessing"/> class.
  204. </summary>
  205. <param name="highlightMotionRegions">Highlight motion regions or not (see <see cref="P:AForge.Vision.Motion.BlobCountingObjectsProcessing.HighlightMotionRegions"/> property).</param>
  206. </member>
  207. <member name="M:AForge.Vision.Motion.BlobCountingObjectsProcessing.#ctor(System.Int32,System.Int32)">
  208. <summary>
  209. Initializes a new instance of the <see cref="T:AForge.Vision.Motion.BlobCountingObjectsProcessing"/> class.
  210. </summary>
  211. <param name="minWidth">Minimum width of acceptable object (see <see cref="P:AForge.Vision.Motion.BlobCountingObjectsProcessing.MinObjectsWidth"/> property).</param>
  212. <param name="minHeight">Minimum height of acceptable object (see <see cref="P:AForge.Vision.Motion.BlobCountingObjectsProcessing.MinObjectsHeight"/> property).</param>
  213. </member>
  214. <member name="M:AForge.Vision.Motion.BlobCountingObjectsProcessing.#ctor(System.Int32,System.Int32,System.Drawing.Color)">
  215. <summary>
  216. Initializes a new instance of the <see cref="T:AForge.Vision.Motion.BlobCountingObjectsProcessing"/> class.
  217. </summary>
  218. <param name="minWidth">Minimum width of acceptable object (see <see cref="P:AForge.Vision.Motion.BlobCountingObjectsProcessing.MinObjectsWidth"/> property).</param>
  219. <param name="minHeight">Minimum height of acceptable object (see <see cref="P:AForge.Vision.Motion.BlobCountingObjectsProcessing.MinObjectsHeight"/> property).</param>
  220. <param name="highlightColor">Color used to highlight motion regions.</param>
  221. </member>
  222. <member name="M:AForge.Vision.Motion.BlobCountingObjectsProcessing.#ctor(System.Int32,System.Int32,System.Boolean)">
  223. <summary>
  224. Initializes a new instance of the <see cref="T:AForge.Vision.Motion.BlobCountingObjectsProcessing"/> class.
  225. </summary>
  226. <param name="minWidth">Minimum width of acceptable object (see <see cref="P:AForge.Vision.Motion.BlobCountingObjectsProcessing.MinObjectsWidth"/> property).</param>
  227. <param name="minHeight">Minimum height of acceptable object (see <see cref="P:AForge.Vision.Motion.BlobCountingObjectsProcessing.MinObjectsHeight"/> property).</param>
  228. <param name="highlightMotionRegions">Highlight motion regions or not (see <see cref="P:AForge.Vision.Motion.BlobCountingObjectsProcessing.HighlightMotionRegions"/> property).</param>
  229. </member>
  230. <member name="M:AForge.Vision.Motion.BlobCountingObjectsProcessing.ProcessFrame(AForge.Imaging.UnmanagedImage,AForge.Imaging.UnmanagedImage)">
  231. <summary>
  232. Process video and motion frames doing further post processing after
  233. performed motion detection.
  234. </summary>
  235. <param name="videoFrame">Original video frame.</param>
  236. <param name="motionFrame">Motion frame provided by motion detection
  237. algorithm (see <see cref="T:AForge.Vision.Motion.IMotionDetector"/>).</param>
  238. <remarks><para>Processes the provided motion frame and counts the number of separate
  239. objects whose size satisfies the <see cref="P:AForge.Vision.Motion.BlobCountingObjectsProcessing.MinObjectsWidth"/> and <see cref="P:AForge.Vision.Motion.BlobCountingObjectsProcessing.MinObjectsHeight"/>
  240. properties. If the <see cref="P:AForge.Vision.Motion.BlobCountingObjectsProcessing.HighlightMotionRegions"/> property is
  241. set to <see langword="true"/>, the found objects are also highlighted on the
  242. original video frame.
  243. </para></remarks>
  244. <exception cref="T:AForge.Imaging.InvalidImagePropertiesException">Motion frame is not 8 bpp image, but it must be so.</exception>
  245. <exception cref="T:AForge.Imaging.UnsupportedImageFormatException">Video frame must be 8 bpp grayscale image or 24/32 bpp color image.</exception>
  246. </member>
  247. <member name="M:AForge.Vision.Motion.BlobCountingObjectsProcessing.Reset">
  248. <summary>
  249. Reset internal state of motion processing algorithm.
  250. </summary>
  251. <remarks><para>The method resets the internal state of the motion processing
  252. algorithm and prepares it for processing of the next video stream or for restarting
  253. the algorithm.</para></remarks>
  254. </member>
  255. <member name="P:AForge.Vision.Motion.BlobCountingObjectsProcessing.HighlightMotionRegions">
  256. <summary>
  257. Highlight motion regions or not.
  258. </summary>
  259. <remarks><para>The property specifies if detected moving objects should be highlighted
  260. with rectangles or not.</para>
  261. <para>Default value is set to <see langword="true"/>.</para>
  262. <para><note>Turning the value on increases the processing time of each video frame.</note></para>
  263. </remarks>
  264. </member>
  265. <member name="P:AForge.Vision.Motion.BlobCountingObjectsProcessing.HighlightColor">
  266. <summary>
  267. Color used to highlight motion regions.
  268. </summary>
  269. <remarks>
  270. <para>Default value is set to <b>red</b> color.</para>
  271. </remarks>
  272. </member>
  273. <member name="P:AForge.Vision.Motion.BlobCountingObjectsProcessing.MinObjectsWidth">
  274. <summary>
  275. Minimum width of acceptable object.
  276. </summary>
  277. <remarks><para>The property sets minimum width of an object to count and highlight. If
  278. objects have smaller width, they are not counted and are not highlighted.</para>
  279. <para>Default value is set to <b>10</b>.</para>
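<para>For example, small blobs could be ignored as follows (a minimal sketch; the size
values are arbitrary):</para>
<code>
BlobCountingObjectsProcessing processing = new BlobCountingObjectsProcessing( );
// count and highlight only objects which are at least 25x25 pixels
processing.MinObjectsWidth  = 25;
processing.MinObjectsHeight = 25;
</code>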
  280. </remarks>
  281. </member>
  282. <member name="P:AForge.Vision.Motion.BlobCountingObjectsProcessing.MinObjectsHeight">
  283. <summary>
  284. Minimum height of acceptable object.
  285. </summary>
  286. <remarks><para>The property sets minimum height of an object to count and highlight. If
  287. objects have smaller height, they are not counted and are not highlighted.</para>
  288. <para>Default value is set to <b>10</b>.</para>
  289. </remarks>
  290. </member>
  291. <member name="P:AForge.Vision.Motion.BlobCountingObjectsProcessing.ObjectsCount">
  292. <summary>
  293. Number of detected objects.
  294. </summary>
  295. <remarks><para>The property provides number of moving objects detected by
  296. the last call of <see cref="M:AForge.Vision.Motion.BlobCountingObjectsProcessing.ProcessFrame(AForge.Imaging.UnmanagedImage,AForge.Imaging.UnmanagedImage)"/> method.</para></remarks>
  297. </member>
  298. <member name="P:AForge.Vision.Motion.BlobCountingObjectsProcessing.ObjectRectangles">
  299. <summary>
  300. Rectangles of moving objects.
  301. </summary>
  302. <remarks><para>The property provides an array of moving objects' rectangles
  303. detected by the last call of the <see cref="M:AForge.Vision.Motion.BlobCountingObjectsProcessing.ProcessFrame(AForge.Imaging.UnmanagedImage,AForge.Imaging.UnmanagedImage)"/> method.</para>
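<para>For example, the detected rectangles could be inspected as follows (a minimal sketch;
it assumes <b>processing</b> is the BlobCountingObjectsProcessing instance passed to the motion
detector and that the property returns an array of <see cref="T:System.Drawing.Rectangle"/> values):</para>
<code>
foreach ( Rectangle rect in processing.ObjectRectangles )
{
    // e.g. log position and size of each detected moving object
    Console.WriteLine( "object at {0}, size {1}x{2}", rect.Location, rect.Width, rect.Height );
}
</code>
</remarks>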
  304. </member>
  305. <member name="T:AForge.Vision.Motion.GridMotionAreaProcessing">
  306. <summary>
  307. Motion processing algorithm, which performs grid processing of motion frame.
  308. </summary>
  309. <remarks><para>The aim of this motion processing algorithm is to do grid processing
  310. of the motion frame. This means that the entire motion frame is divided by a grid into
  311. a certain number of cells and the motion level is calculated for each cell. The
  312. information about each cell's motion level may be retrieved using <see cref="P:AForge.Vision.Motion.GridMotionAreaProcessing.MotionGrid"/>
  313. property.</para>
  314. <para>In addition the algorithm can highlight those cells, which have motion
  315. level above the specified threshold (see <see cref="P:AForge.Vision.Motion.GridMotionAreaProcessing.MotionAmountToHighlight"/>
  316. property). To enable this, set the <see cref="P:AForge.Vision.Motion.GridMotionAreaProcessing.HighlightMotionGrid"/>
  317. property to <see langword="true"/>.</para>
  318. <para>Sample usage:</para>
  319. <code>
  320. // create instance of motion detection algorithm
  321. IMotionDetector motionDetector = new ... ;
  322. // create instance of motion processing algorithm
  323. GridMotionAreaProcessing motionProcessing = new GridMotionAreaProcessing( 16, 16 );
  324. // create motion detector
  325. MotionDetector detector = new MotionDetector( motionDetector, motionProcessing );
  326. // continuously feed video frames to motion detector
  327. while ( ... )
  328. {
  329. // process new video frame
  330. detector.ProcessFrame( videoFrame );
  331. // check motion level in 5th row 8th column
  332. if ( motionProcessing.MotionGrid[5, 8] &gt; 0.15 )
  333. {
  334. // ...
  335. }
  336. }
  337. </code>
  338. </remarks>
  339. <seealso cref="T:AForge.Vision.Motion.MotionDetector"/>
  340. <seealso cref="T:AForge.Vision.Motion.IMotionDetector"/>
  341. </member>
  342. <member name="M:AForge.Vision.Motion.GridMotionAreaProcessing.#ctor">
  343. <summary>
  344. Initializes a new instance of the <see cref="T:AForge.Vision.Motion.GridMotionAreaProcessing"/> class.
  345. </summary>
  346. </member>
  347. <member name="M:AForge.Vision.Motion.GridMotionAreaProcessing.#ctor(System.Int32,System.Int32)">
  348. <summary>
  349. Initializes a new instance of the <see cref="T:AForge.Vision.Motion.GridMotionAreaProcessing"/> class.
  350. </summary>
  351. <param name="gridWidth">Width of motion grid (see <see cref="P:AForge.Vision.Motion.GridMotionAreaProcessing.GridWidth"/> property).</param>
  352. <param name="gridHeight">Height of motion grid (see <see cref="P:AForge.Vision.Motion.GridMotionAreaProcessing.GridHeight"/> property).</param>
  353. </member>
  354. <member name="M:AForge.Vision.Motion.GridMotionAreaProcessing.#ctor(System.Int32,System.Int32,System.Boolean)">
  355. <summary>
  356. Initializes a new instance of the <see cref="T:AForge.Vision.Motion.GridMotionAreaProcessing"/> class.
  357. </summary>
  358. <param name="gridWidth">Width of motion grid (see <see cref="P:AForge.Vision.Motion.GridMotionAreaProcessing.GridWidth"/> property).</param>
  359. <param name="gridHeight">Height of motion grid (see <see cref="P:AForge.Vision.Motion.GridMotionAreaProcessing.GridHeight"/> property).</param>
  360. <param name="highlightMotionGrid">Highlight motion regions or not (see <see cref="P:AForge.Vision.Motion.GridMotionAreaProcessing.HighlightMotionGrid"/> property).</param>
  361. </member>
  362. <member name="M:AForge.Vision.Motion.GridMotionAreaProcessing.#ctor(System.Int32,System.Int32,System.Boolean,System.Single)">
  363. <summary>
  364. Initializes a new instance of the <see cref="T:AForge.Vision.Motion.GridMotionAreaProcessing"/> class.
  365. </summary>
  366. <param name="gridWidth">Width of motion grid (see <see cref="P:AForge.Vision.Motion.GridMotionAreaProcessing.GridWidth"/> property).</param>
  367. <param name="gridHeight">Height of motion grid (see <see cref="P:AForge.Vision.Motion.GridMotionAreaProcessing.GridHeight"/> property).</param>
  368. <param name="highlightMotionGrid">Highlight motion regions or not (see <see cref="P:AForge.Vision.Motion.GridMotionAreaProcessing.HighlightMotionGrid"/> property).</param>
  369. <param name="motionAmountToHighlight">Motion amount to highlight cell (see <see cref="P:AForge.Vision.Motion.GridMotionAreaProcessing.MotionAmountToHighlight"/> property).</param>
  370. </member>
  371. <member name="M:AForge.Vision.Motion.GridMotionAreaProcessing.ProcessFrame(AForge.Imaging.UnmanagedImage,AForge.Imaging.UnmanagedImage)">
  372. <summary>
  373. Process video and motion frames doing further post processing after
  374. performed motion detection.
  375. </summary>
  376. <param name="videoFrame">Original video frame.</param>
  377. <param name="motionFrame">Motion frame provided by motion detection
  378. algorithm (see <see cref="T:AForge.Vision.Motion.IMotionDetector"/>).</param>
  379. <remarks><para>Processes provided motion frame and calculates motion level
  380. for each grid cell. If the <see cref="P:AForge.Vision.Motion.GridMotionAreaProcessing.HighlightMotionGrid"/> property is
  381. set to <see langword="true"/>, the cells with motion level above the threshold are
  382. highlighted.</para></remarks>
  383. <exception cref="T:AForge.Imaging.InvalidImagePropertiesException">Motion frame is not 8 bpp image, but it must be so.</exception>
  384. <exception cref="T:AForge.Imaging.UnsupportedImageFormatException">Video frame must be 8 bpp grayscale image or 24/32 bpp color image.</exception>
  385. </member>
  386. <member name="M:AForge.Vision.Motion.GridMotionAreaProcessing.Reset">
  387. <summary>
  388. Reset internal state of motion processing algorithm.
  389. </summary>
  390. <remarks><para>The method resets the internal state of the motion processing
  391. algorithm and prepares it for processing of the next video stream or for restarting
  392. the algorithm.</para></remarks>
  393. </member>
  394. <member name="P:AForge.Vision.Motion.GridMotionAreaProcessing.HighlightColor">
  395. <summary>
  396. Color used to highlight motion regions.
  397. </summary>
  398. <remarks>
  399. <para>Default value is set to <b>red</b> color.</para>
  400. </remarks>
  401. </member>
  402. <member name="P:AForge.Vision.Motion.GridMotionAreaProcessing.HighlightMotionGrid">
  403. <summary>
  404. Highlight motion regions or not.
  405. </summary>
  406. <remarks><para>The property specifies if motion grid should be highlighted -
  407. if cells, which have motion level above the
  408. <see cref="P:AForge.Vision.Motion.GridMotionAreaProcessing.MotionAmountToHighlight">specified value</see>, should be highlighted.</para>
  409. <para>Default value is set to <see langword="true"/>.</para>
  410. <para><note>Turning the value on increases the processing time of each video frame.</note></para>
  411. </remarks>
  412. </member>
  413. <member name="P:AForge.Vision.Motion.GridMotionAreaProcessing.MotionAmountToHighlight">
  414. <summary>
  415. Motion amount to highlight cell.
  416. </summary>
  417. <remarks><para>The property specifies motion level threshold for highlighting grid's
  418. cells. If motion level of a certain cell is higher than this value, then the cell
  419. is highlighted.</para>
  420. <para>Default value is set to <b>0.15</b>.</para>
  421. </remarks>
  422. </member>
  423. <member name="P:AForge.Vision.Motion.GridMotionAreaProcessing.MotionGrid">
  424. <summary>
  425. Motion levels of each grid's cell.
  426. </summary>
  427. <remarks><para>The property represents an array of size
  428. <see cref="P:AForge.Vision.Motion.GridMotionAreaProcessing.GridHeight"/>x<see cref="P:AForge.Vision.Motion.GridMotionAreaProcessing.GridWidth"/>, which keeps motion level
  429. of each grid cell. If a certain cell has a motion level equal to 0.2, it
  430. means that this cell has 20% of changes.</para>
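<para>For example, the most active cell could be located as follows (a minimal sketch;
it assumes <b>motionProcessing</b> is the GridMotionAreaProcessing instance used by the motion
detector and that the grid is indexed as [row, column], as in the class sample of
<see cref="T:AForge.Vision.Motion.GridMotionAreaProcessing"/>):</para>
<code>
// find the highest motion level among all grid cells
float maxMotion = 0f;
for ( int row = 0; row &lt; motionProcessing.GridHeight; row++ )
{
    for ( int col = 0; col &lt; motionProcessing.GridWidth; col++ )
    {
        if ( motionProcessing.MotionGrid[row, col] &gt; maxMotion )
        {
            maxMotion = motionProcessing.MotionGrid[row, col];
        }
    }
}
if ( maxMotion &gt; 0.15 )
{
    // at least one cell shows significant motion
}
</code>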
  431. </remarks>
  432. </member>
  433. <member name="P:AForge.Vision.Motion.GridMotionAreaProcessing.GridWidth">
  434. <summary>
  435. Width of motion grid, [2, 64].
  436. </summary>
  437. <remarks><para>The property specifies the motion grid's width - the number of grid columns.</para>
  438. <para>Default value is set to <b>16</b>.</para>
  439. </remarks>
  440. </member>
  441. <member name="P:AForge.Vision.Motion.GridMotionAreaProcessing.GridHeight">
  442. <summary>
  443. Height of motion grid, [2, 64].
  444. </summary>
  445. <remarks><para>The property specifies the motion grid's height - the number of grid rows.</para>
  446. <para>Default value is set to <b>16</b>.</para>
  447. </remarks>
  448. </member>
  449. <member name="T:AForge.Vision.Motion.MotionDetector">
  450. <summary>
  451. Motion detection wrapper class, which performs motion detection and processing.
  452. </summary>
  453. <remarks><para>The class serves as a wrapper class for
  454. <see cref="T:AForge.Vision.Motion.IMotionDetector">motion detection</see> and
  455. <see cref="T:AForge.Vision.Motion.IMotionProcessing">motion processing</see> algorithms, allowing to call them with
  456. single call. Unlike motion detection and motion processing interfaces, the class also
  457. provides additional methods for convenience, so the algorithms could be applied not
  458. only to <see cref="T:AForge.Imaging.UnmanagedImage"/>, but to .NET's <see cref="T:System.Drawing.Bitmap"/> class
  459. as well.</para>
  460. <para>In addition to wrapping the motion detection and processing algorithms, the class provides
  461. some additional functionality. Using <see cref="P:AForge.Vision.Motion.MotionDetector.MotionZones"/> property it is possible to specify
  462. a set of rectangular zones to observe - only motion in these zones is counted and post-processed.</para>
  463. <para>Sample usage:</para>
  464. <code>
  465. // create motion detector
  466. MotionDetector detector = new MotionDetector(
  467. new SimpleBackgroundModelingDetector( ),
  468. new MotionAreaHighlighting( ) );
  469. // continuously feed video frames to motion detector
  470. while ( ... )
  471. {
  472. // process new video frame and check motion level
  473. if ( detector.ProcessFrame( videoFrame ) &gt; 0.02 )
  474. {
  475. // ring alarm or do something else
  476. }
  477. }
  478. </code>
  479. </remarks>
  480. </member>
  481. <member name="M:AForge.Vision.Motion.MotionDetector.#ctor(AForge.Vision.Motion.IMotionDetector)">
  482. <summary>
  483. Initializes a new instance of the <see cref="T:AForge.Vision.Motion.MotionDetector"/> class.
  484. </summary>
  485. <param name="detector">Motion detection algorithm to apply to each video frame.</param>
  486. </member>
  487. <member name="M:AForge.Vision.Motion.MotionDetector.#ctor(AForge.Vision.Motion.IMotionDetector,AForge.Vision.Motion.IMotionProcessing)">
  488. <summary>
  489. Initializes a new instance of the <see cref="T:AForge.Vision.Motion.MotionDetector"/> class.
  490. </summary>
  491. <param name="detector">Motion detection algorithm to apply to each video frame.</param>
  492. <param name="processor">Motion processing algorithm to apply to each video frame after
  493. motion detection is done.</param>
  494. </member>
  495. <member name="M:AForge.Vision.Motion.MotionDetector.ProcessFrame(System.Drawing.Bitmap)">
  496. <summary>
  497. Process new video frame.
  498. </summary>
  499. <param name="videoFrame">Video frame to process (detect motion in).</param>
  500. <returns>Returns the amount of motion, which is provided by the <see cref="P:AForge.Vision.Motion.IMotionDetector.MotionLevel"/>
  501. property of the <see cref="P:AForge.Vision.Motion.MotionDetector.MotionDetectionAlgorithm">motion detection algorithm in use</see>.</returns>
  502. <remarks><para>See <see cref="M:AForge.Vision.Motion.MotionDetector.ProcessFrame(AForge.Imaging.UnmanagedImage)"/> for additional details.</para>
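<para>For example, the method could be called from a video source's new frame handler (a minimal
sketch; it assumes frames are delivered as <see cref="T:System.Drawing.Bitmap"/> objects, for example by an
AForge.Video source through its <b>NewFrame</b> event, and that <b>detector</b> is a configured
<see cref="T:AForge.Vision.Motion.MotionDetector"/> instance):</para>
<code>
private void videoSource_NewFrame( object sender, NewFrameEventArgs eventArgs )
{
    // feed the new frame to the detector and check the returned motion level
    if ( detector.ProcessFrame( eventArgs.Frame ) &gt; 0.02 )
    {
        // motion detected in the current frame
    }
}
</code>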
  503. </remarks>
  504. </member>
  505. <member name="M:AForge.Vision.Motion.MotionDetector.ProcessFrame(System.Drawing.Imaging.BitmapData)">
  506. <summary>
  507. Process new video frame.
  508. </summary>
  509. <param name="videoFrame">Video frame to process (detect motion in).</param>
  510. <returns>Returns the amount of motion, which is provided by the <see cref="P:AForge.Vision.Motion.IMotionDetector.MotionLevel"/>
  511. property of the <see cref="P:AForge.Vision.Motion.MotionDetector.MotionDetectionAlgorithm">motion detection algorithm in use</see>.</returns>
  512. <remarks><para>See <see cref="M:AForge.Vision.Motion.MotionDetector.ProcessFrame(AForge.Imaging.UnmanagedImage)"/> for additional details.</para>
  513. </remarks>
  514. </member>
  515. <member name="M:AForge.Vision.Motion.MotionDetector.ProcessFrame(AForge.Imaging.UnmanagedImage)">
  516. <summary>
  517. Process new video frame.
  518. </summary>
  519. <param name="videoFrame">Video frame to process (detect motion in).</param>
  520. <returns>Returns the amount of motion, which is provided by the <see cref="P:AForge.Vision.Motion.IMotionDetector.MotionLevel"/>
  521. property of the <see cref="P:AForge.Vision.Motion.MotionDetector.MotionDetectionAlgorithm">motion detection algorithm in use</see>.</returns>
  522. <remarks><para>The method first of all applies motion detection algorithm to the specified video
  523. frame to calculate <see cref="P:AForge.Vision.Motion.IMotionDetector.MotionLevel">motion level</see> and
  524. <see cref="P:AForge.Vision.Motion.IMotionDetector.MotionFrame">motion frame</see>. After this it applies motion processing algorithm
  525. (if it was set) to do further post processing, like highlighting motion areas, counting moving
  526. objects, etc.</para>
  527. <para><note>If the <see cref="P:AForge.Vision.Motion.MotionDetector.MotionZones"/> property is set, this method will perform
  528. motion filtering right after motion algorithm is done and before passing motion frame to motion
  529. processing algorithm. The method does filtering right on the motion frame, which is produced
  530. by motion detection algorithm. At the same time the method recalculates motion level and returns
  531. a new value, which takes the motion zones into account (but the new value is not set back to the motion detection
  532. algorithm's <see cref="P:AForge.Vision.Motion.IMotionDetector.MotionLevel"/> property).
  533. </note></para>
  534. </remarks>
  535. </member>
  536. <member name="M:AForge.Vision.Motion.MotionDetector.Reset">
  537. <summary>
  538. Reset motion detector to initial state.
  539. </summary>
  540. <remarks><para>The method resets the motion detection and motion processing algorithms by calling
  541. their <see cref="M:AForge.Vision.Motion.IMotionDetector.Reset"/> and <see cref="M:AForge.Vision.Motion.IMotionProcessing.Reset"/> methods.</para>
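<para>For example (a minimal sketch; it assumes <b>detector</b> is an existing
<see cref="T:AForge.Vision.Motion.MotionDetector"/> instance and a new video source is about to be processed):</para>
<code>
// forget the accumulated background/motion history before switching to another video source
detector.Reset( );
</code>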
  542. </remarks>
  543. </member>
  544. <member name="P:AForge.Vision.Motion.MotionDetector.MotionDetectionAlgorithm">
  545. <summary>
  546. Motion detection algorithm to apply to each video frame.
  547. </summary>
  548. <remarks><para>The property sets motion detection algorithm, which is used by
  549. <see cref="M:AForge.Vision.Motion.MotionDetector.ProcessFrame(AForge.Imaging.UnmanagedImage)"/> method in order to calculate
  550. <see cref="P:AForge.Vision.Motion.IMotionDetector.MotionLevel">motion level</see> and
  551. <see cref="P:AForge.Vision.Motion.IMotionDetector.MotionFrame">motion frame</see>.
  552. </para></remarks>
  553. </member>
  554. <member name="P:AForge.Vision.Motion.MotionDetector.MotionProcessingAlgorithm">
  555. <summary>
  556. Motion processing algorithm to apply to each video frame after
  557. motion detection is done.
  558. </summary>
  559. <remarks><para>The property sets motion processing algorithm, which is used by
  560. <see cref="M:AForge.Vision.Motion.MotionDetector.ProcessFrame(AForge.Imaging.UnmanagedImage)"/> method after motion detection in order to do further
  561. post processing of motion frames. The aim of further post processing depends on
  562. actual implementation of the specified motion processing algorithm - it can be
  563. highlighting of motion area, objects counting, etc.
  564. </para></remarks>
  565. </member>
  566. <member name="P:AForge.Vision.Motion.MotionDetector.MotionZones">
  567. <summary>
  568. Set of zones to detect motion in.
  569. </summary>
  570. <remarks><para>The property keeps an array of rectangular zones, which are observed for motion detection.
  571. Motion outside of these zones is ignored.</para>
  572. <para>If this property is set, the <see cref="M:AForge.Vision.Motion.MotionDetector.ProcessFrame(AForge.Imaging.UnmanagedImage)"/> method
  573. will filter out all motion which was detected by the motion detection algorithm but is not
  574. located in the specified zones.</para>
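<para>For example, motion detection could be restricted to two regions of the frame as follows
(a minimal sketch; the coordinates are arbitrary and it assumes the property accepts an array of
<see cref="T:System.Drawing.Rectangle"/> values):</para>
<code>
detector.MotionZones = new Rectangle[]
{
    new Rectangle(   0,   0, 160, 120 ),
    new Rectangle( 320, 240, 160, 120 )
};
</code>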
  575. </remarks>
  576. </member>
  577. <member name="T:AForge.Vision.Motion.TwoFramesDifferenceDetector">
  578. <summary>
  579. Motion detector based on the difference of two consecutive frames.
  580. </summary>
  581. <remarks><para>The class implements the simplest motion detection algorithm, which is
  582. based on the difference of two consecutive frames. The <see cref="P:AForge.Vision.Motion.TwoFramesDifferenceDetector.MotionFrame">difference frame</see>
  583. is thresholded and the <see cref="P:AForge.Vision.Motion.TwoFramesDifferenceDetector.MotionLevel">amount of difference pixels</see> is calculated.
  584. To suppress stand-alone noisy pixels, an erosion morphological operator may be applied, which
  585. is controlled by <see cref="P:AForge.Vision.Motion.TwoFramesDifferenceDetector.SuppressNoise"/> property.</para>
  586. <para>Although the class may be used on its own to perform motion detection, it is preferred
  587. to use it in conjunction with <see cref="T:AForge.Vision.Motion.MotionDetector"/> class, which provides additional
  588. features and allows using motion post-processing algorithms.</para>
  589. <para>Sample usage:</para>
  590. <code>
  591. // create motion detector
  592. MotionDetector detector = new MotionDetector(
  593. new TwoFramesDifferenceDetector( ),
  594. new MotionAreaHighlighting( ) );
  595. // continuously feed video frames to motion detector
  596. while ( ... )
  597. {
  598. // process new video frame and check motion level
  599. if ( detector.ProcessFrame( videoFrame ) &gt; 0.02 )
  600. {
  601. // ring alarm or do something else
  602. }
  603. }
  604. </code>
  605. </remarks>
  606. <seealso cref="T:AForge.Vision.Motion.MotionDetector"/>
  607. </member>
  608. <member name="M:AForge.Vision.Motion.TwoFramesDifferenceDetector.#ctor">
  609. <summary>
  610. Initializes a new instance of the <see cref="T:AForge.Vision.Motion.TwoFramesDifferenceDetector"/> class.
  611. </summary>
  612. </member>
  613. <member name="M:AForge.Vision.Motion.TwoFramesDifferenceDetector.#ctor(System.Boolean)">
  614. <summary>
  615. Initializes a new instance of the <see cref="T:AForge.Vision.Motion.TwoFramesDifferenceDetector"/> class.
  616. </summary>
  617. <param name="suppressNoise">Suppress noise in video frames or not (see <see cref="P:AForge.Vision.Motion.TwoFramesDifferenceDetector.SuppressNoise"/> property).</param>
  618. </member>
  619. <member name="M:AForge.Vision.Motion.TwoFramesDifferenceDetector.ProcessFrame(AForge.Imaging.UnmanagedImage)">
  620. <summary>
  621. Process new video frame.
  622. </summary>
  623. <param name="videoFrame">Video frame to process (detect motion in).</param>
  624. <remarks><para>Processes new frame from video source and detects motion in it.</para>
  625. <para>Check <see cref="P:AForge.Vision.Motion.TwoFramesDifferenceDetector.MotionLevel"/> property to get information about amount of motion
  626. (changes) in the processed frame.</para>
  627. </remarks>
  628. </member>
  629. <member name="M:AForge.Vision.Motion.TwoFramesDifferenceDetector.Reset">
  630. <summary>
  631. Reset motion detector to initial state.
  632. </summary>
  633. <remarks><para>Resets internal state and variables of motion detection algorithm.
  634. Usually this is done before processing a new video source, but it
  635. may also be done at any time to restart the motion detection algorithm.</para>
  636. </remarks>
  637. </member>
  638. <member name="P:AForge.Vision.Motion.TwoFramesDifferenceDetector.DifferenceThreshold">
  639. <summary>
  640. Difference threshold value, [1, 255].
  641. </summary>
  642. <remarks><para>The value specifies the amount of difference between pixels, which is treated
  643. as a motion pixel.</para>
  644. <para>Default value is set to <b>15</b>.</para>
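<para>For example, the detector could be made less sensitive as follows (a minimal sketch;
the threshold value is arbitrary):</para>
<code>
TwoFramesDifferenceDetector detector = new TwoFramesDifferenceDetector( );
// require a larger per-pixel difference before a pixel is treated as motion
detector.DifferenceThreshold = 25;
</code>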
  645. </remarks>
  646. </member>
  647. <member name="P:AForge.Vision.Motion.TwoFramesDifferenceDetector.MotionLevel">
  648. <summary>
  649. Motion level value, [0, 1].
  650. </summary>
  651. <remarks><para>Amount of changes in the last processed frame. For example, if the value of
  652. this property equals 0.1, it means that the last processed frame has a 10% difference
  653. from the previous frame.</para>
  654. </remarks>
  655. </member>
  656. <member name="P:AForge.Vision.Motion.TwoFramesDifferenceDetector.MotionFrame">
  657. <summary>
  658. Motion frame containing detected areas of motion.
  659. </summary>
  660. <remarks><para>Motion frame is a grayscale image, which shows areas of detected motion.
  661. Black pixels in the motion frame correspond to areas where no motion is
  662. detected, while white pixels correspond to areas where motion is detected.</para>
  663. <para><note>The property is set to <see langword="null"/> after processing of the first
  664. video frame by the algorithm.</note></para>
  665. </remarks>
  666. </member>
  667. <member name="P:AForge.Vision.Motion.TwoFramesDifferenceDetector.SuppressNoise">
  668. <summary>
  669. Suppress noise in video frames or not.
  670. </summary>
  671. <remarks><para>The value specifies if additional filtering should be
  672. done to suppress standalone noisy pixels by applying 3x3 erosion image processing
  673. filter.</para>
  674. <para>Default value is set to <see langword="true"/>.</para>
  675. <para><note>Turning the value on increases the processing time of each video frame.</note></para>
  676. </remarks>
  677. </member>
  678. <member name="T:AForge.Vision.Motion.MotionBorderHighlighting">
  679. <summary>
  680. Motion processing algorithm, which highlights borders of motion areas.
  681. </summary>
  682. <remarks><para>The aim of this motion processing algorithm is to highlight
  683. borders of motion areas with the <see cref="P:AForge.Vision.Motion.MotionBorderHighlighting.HighlightColor">specified color</see>.
  684. </para>
  685. <para><note>The motion processing algorithm is supposed to be used only with motion detection
  686. algorithms, which are based on finding the difference with a background frame
  687. (see <see cref="T:AForge.Vision.Motion.SimpleBackgroundModelingDetector"/> and <see cref="T:AForge.Vision.Motion.CustomFrameDifferenceDetector"/>
  688. as simple implementations) and allow extracting moving objects clearly.</note></para>
  689. <para>Sample usage:</para>
  690. <code>
  691. // create motion detector
  692. MotionDetector detector = new MotionDetector(
  693. /* motion detection algorithm */,
  694. new MotionBorderHighlighting( ) );
  695. // continuously feed video frames to motion detector
  696. while ( ... )
  697. {
  698. // process new video frame
  699. detector.ProcessFrame( videoFrame );
  700. }
  701. </code>
  702. </remarks>
  703. <seealso cref="T:AForge.Vision.Motion.MotionDetector"/>
  704. <seealso cref="T:AForge.Vision.Motion.IMotionDetector"/>
  705. </member>
  706. <member name="M:AForge.Vision.Motion.MotionBorderHighlighting.#ctor">
  707. <summary>
  708. Initializes a new instance of the <see cref="T:AForge.Vision.Motion.MotionBorderHighlighting"/> class.
  709. </summary>
  710. </member>
  711. <member name="M:AForge.Vision.Motion.MotionBorderHighlighting.#ctor(System.Drawing.Color)">
  712. <summary>
  713. Initializes a new instance of the <see cref="T:AForge.Vision.Motion.MotionBorderHighlighting"/> class.
  714. </summary>
  715. <param name="highlightColor">Color used to highlight motion regions.</param>
  716. </member>
  717. <member name="M:AForge.Vision.Motion.MotionBorderHighlighting.ProcessFrame(AForge.Imaging.UnmanagedImage,AForge.Imaging.UnmanagedImage)">
  718. <summary>
  719. Process video and motion frames doing further post processing after
  720. performed motion detection.
  721. </summary>
  722. <param name="videoFrame">Original video frame.</param>
  723. <param name="motionFrame">Motion frame provided by motion detection
  724. algorithm (see <see cref="T:AForge.Vision.Motion.IMotionDetector"/>).</param>
  725. <remarks><para>Processes provided motion frame and highlights borders of motion areas
  726. on the original video frame with <see cref="P:AForge.Vision.Motion.MotionBorderHighlighting.HighlightColor">specified color</see>.</para>
  727. </remarks>
  728. <exception cref="T:AForge.Imaging.InvalidImagePropertiesException">Motion frame is not 8 bpp image, but it must be so.</exception>
  729. <exception cref="T:AForge.Imaging.UnsupportedImageFormatException">Video frame must be 8 bpp grayscale image or 24/32 bpp color image.</exception>
  730. </member>
  731. <member name="M:AForge.Vision.Motion.MotionBorderHighlighting.Reset">
  732. <summary>
  733. Reset internal state of motion processing algorithm.
  734. </summary>
  735. <remarks><para>The method resets the internal state of the motion processing
  736. algorithm and prepares it for processing of the next video stream or for restarting
  737. the algorithm.</para></remarks>
  738. </member>
  739. <member name="P:AForge.Vision.Motion.MotionBorderHighlighting.HighlightColor">
  740. <summary>
  741. Color used to highlight motion regions.
  742. </summary>
  743. <remarks>
  744. <para>Default value is set to <b>red</b> color.</para>
  745. </remarks>
  746. </member>
  747. <member name="T:AForge.Vision.Motion.CustomFrameDifferenceDetector">
  748. <summary>
  749. Motion detector based on difference with predefined background frame.
  750. </summary>
  751. <remarks><para>The class implements motion detection algorithm, which is based on
  752. difference of current video frame with predefined background frame. The <see cref="P:AForge.Vision.Motion.CustomFrameDifferenceDetector.MotionFrame">difference frame</see>
  753. is thresholded and the <see cref="P:AForge.Vision.Motion.CustomFrameDifferenceDetector.MotionLevel">amount of difference pixels</see> is calculated.
  754. To suppress stand-alone noisy pixels, an erosion morphological operator may be applied, which
  755. is controlled by <see cref="P:AForge.Vision.Motion.CustomFrameDifferenceDetector.SuppressNoise"/> property.</para>
  756. <para><note>If precise borders of motion areas are required (for example,
  757. for further motion post-processing), then the <see cref="P:AForge.Vision.Motion.CustomFrameDifferenceDetector.KeepObjectsEdges"/> property
  758. may be used to restore borders after noise suppression.</note></para>
  759. <para><note>If a custom background frame is not specified using the
  760. <see cref="M:AForge.Vision.Motion.CustomFrameDifferenceDetector.SetBackgroundFrame(System.Drawing.Bitmap)"/> method, the algorithm takes the first video frame
  761. as a background frame and calculates difference of further video frames with it.</note></para>
  762. <para>Unlike the <see cref="T:AForge.Vision.Motion.TwoFramesDifferenceDetector"/> motion detection algorithm, this algorithm
  763. allows identifying quite clearly all objects which are not part of the background (scene) -
  764. most likely moving objects.</para>
  765. <para>Sample usage:</para>
  766. <code>
  767. // create motion detector
  768. MotionDetector detector = new MotionDetector(
  769. new CustomFrameDifferenceDetector( ),
  770. new MotionAreaHighlighting( ) );
  771. // continuously feed video frames to motion detector
  772. while ( ... )
  773. {
  774. // process new video frame and check motion level
  775. if ( detector.ProcessFrame( videoFrame ) &gt; 0.02 )
  776. {
  777. // ring alarm or do something else
  778. }
  779. }
  780. </code>
  781. </remarks>
  782. <seealso cref="T:AForge.Vision.Motion.MotionDetector"/>
  783. </member>
  784. <member name="M:AForge.Vision.Motion.CustomFrameDifferenceDetector.#ctor">
  785. <summary>
  786. Initializes a new instance of the <see cref="T:AForge.Vision.Motion.CustomFrameDifferenceDetector"/> class.
  787. </summary>
  788. </member>
  789. <member name="M:AForge.Vision.Motion.CustomFrameDifferenceDetector.#ctor(System.Boolean)">
  790. <summary>
  791. Initializes a new instance of the <see cref="T:AForge.Vision.Motion.CustomFrameDifferenceDetector"/> class.
  792. </summary>
  793. <param name="suppressNoise">Suppress noise in video frames or not (see <see cref="P:AForge.Vision.Motion.CustomFrameDifferenceDetector.SuppressNoise"/> property).</param>
  794. </member>
  795. <member name="M:AForge.Vision.Motion.CustomFrameDifferenceDetector.#ctor(System.Boolean,System.Boolean)">
  796. <summary>
  797. Initializes a new instance of the <see cref="T:AForge.Vision.Motion.CustomFrameDifferenceDetector"/> class.
  798. </summary>
  799. <param name="suppressNoise">Suppress noise in video frames or not (see <see cref="P:AForge.Vision.Motion.CustomFrameDifferenceDetector.SuppressNoise"/> property).</param>
  800. <param name="keepObjectEdges">Restore objects edges after noise suppression or not (see <see cref="P:AForge.Vision.Motion.CustomFrameDifferenceDetector.KeepObjectsEdges"/> property).</param>
  801. </member>
  802. <member name="M:AForge.Vision.Motion.CustomFrameDifferenceDetector.ProcessFrame(AForge.Imaging.UnmanagedImage)">
  803. <summary>
  804. Process new video frame.
  805. </summary>
  806. <param name="videoFrame">Video frame to process (detect motion in).</param>
  807. <remarks><para>Processes new frame from video source and detects motion in it.</para>
  808. <para>Check <see cref="P:AForge.Vision.Motion.CustomFrameDifferenceDetector.MotionLevel"/> property to get information about amount of motion
  809. (changes) in the processed frame.</para>
  810. </remarks>
  811. </member>
  812. <member name="M:AForge.Vision.Motion.CustomFrameDifferenceDetector.Reset">
  813. <summary>
  814. Reset motion detector to initial state.
  815. </summary>
  816. <remarks><para>Resets internal state and variables of motion detection algorithm.
  817. Usually this is done before processing a new video source, but it
  818. may also be done at any time to restart the motion detection algorithm.</para>
  819. <para><note>If a custom background frame was set using the
  820. <see cref="M:AForge.Vision.Motion.CustomFrameDifferenceDetector.SetBackgroundFrame(System.Drawing.Bitmap)"/> method, this method does not reset it.
  821. The method resets only automatically generated background frame.
  822. </note></para>
  823. </remarks>
  824. </member>
  825. <member name="M:AForge.Vision.Motion.CustomFrameDifferenceDetector.SetBackgroundFrame(System.Drawing.Bitmap)">
  826. <summary>
  827. Set background frame.
  828. </summary>
  829. <param name="backgroundFrame">Background frame to set.</param>
  830. <remarks><para>The method sets the background frame, with which the difference
  831. of subsequent video frames will be calculated.</para>
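<para>For example (a minimal sketch; <b>background.png</b> is a hypothetical file containing an
image of the empty scene):</para>
<code>
CustomFrameDifferenceDetector detector = new CustomFrameDifferenceDetector( );
// use a previously saved image of the empty scene as the background frame
Bitmap background = new Bitmap( "background.png" );
detector.SetBackgroundFrame( background );
</code>
</remarks>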
  832. </member>
  833. <member name="M:AForge.Vision.Motion.CustomFrameDifferenceDetector.SetBackgroundFrame(System.Drawing.Imaging.BitmapData)">
  834. <summary>
  835. Set background frame.
  836. </summary>
  837. <param name="backgroundFrame">Background frame to set.</param>
  838. <remarks><para>The method sets the background frame, with which the difference
  839. of subsequent video frames will be calculated.</para></remarks>
  840. </member>
  841. <member name="M:AForge.Vision.Motion.CustomFrameDifferenceDetector.SetBackgroundFrame(AForge.Imaging.UnmanagedImage)">
  842. <summary>
  843. Set background frame.
  844. </summary>
  845. <param name="backgroundFrame">Background frame to set.</param>
  846. <remarks><para>The method sets the background frame, with which the difference
  847. of subsequent video frames will be calculated.</para></remarks>
  848. </member>
  849. <member name="P:AForge.Vision.Motion.CustomFrameDifferenceDetector.DifferenceThreshold">
  850. <summary>
  851. Difference threshold value, [1, 255].
  852. </summary>
  853. <remarks><para>The value specifies the amount of difference between pixels, which is treated
  854. as a motion pixel.</para>
  855. <para>Default value is set to <b>15</b>.</para>
  856. </remarks>
  857. </member>
  858. <member name="P:AForge.Vision.Motion.CustomFrameDifferenceDetector.MotionLevel">
  859. <summary>
  860. Motion level value, [0, 1].
  861. </summary>
  862. <remarks><para>Amount of changes in the last processed frame. For example, if the value of
  863. this property equals 0.1, it means that the last processed frame has a 10% difference
  864. from the defined background frame.</para>
  865. </remarks>
  866. </member>
  867. <member name="P:AForge.Vision.Motion.CustomFrameDifferenceDetector.MotionFrame">
  868. <summary>
  869. Motion frame containing detected areas of motion.
  870. </summary>
  871. <remarks><para>Motion frame is a grayscale image, which shows areas of detected motion.
  872. Black pixels in the motion frame correspond to areas where no motion is
  873. detected, while white pixels correspond to areas where motion is detected.</para>
  874. <para><note>The property is set to <see langword="null"/> after processing of the first
  875. video frame by the algorithm if a custom background frame was not set manually
  876. by using the <see cref="M:AForge.Vision.Motion.CustomFrameDifferenceDetector.SetBackgroundFrame(System.Drawing.Bitmap)"/> method (it will not be <see langword="null"/>
  877. after the second call in this case). If a correct custom background
  878. was set, then the property should be set to the estimated motion frame after a
  879. <see cref="M:AForge.Vision.Motion.CustomFrameDifferenceDetector.ProcessFrame(AForge.Imaging.UnmanagedImage)"/> method call.</note></para>
  880. </remarks>
  881. </member>
  882. <member name="P:AForge.Vision.Motion.CustomFrameDifferenceDetector.SuppressNoise">
  883. <summary>
  884. Suppress noise in video frames or not.
  885. </summary>
  886. <remarks><para>The value specifies if additional filtering should be
  887. done to suppress standalone noisy pixels by applying 3x3 erosion image processing
  888. filter. See <see cref="P:AForge.Vision.Motion.CustomFrameDifferenceDetector.KeepObjectsEdges"/> property, if it is required to restore
  889. edges of objects, which are not noise.</para>
  890. <para>Default value is set to <see langword="true"/>.</para>
  891. <para><note>Turning the value on increases the processing time of each video frame.</note></para>
  892. </remarks>
  893. </member>
  894. <member name="P:AForge.Vision.Motion.CustomFrameDifferenceDetector.KeepObjectsEdges">
  895. <summary>
  896. Restore objects edges after noise suppression or not.
  897. </summary>
  898. <remarks><para>The value specifies if additional filtering should be done
  899. to restore objects' edges after noise suppression by applying a 3x3 dilation
  900. image processing filter.</para>
  901. <para>Default value is set to <see langword="false"/>.</para>
  902. <para><note>Turning the value on increases the processing time of each video frame.</note></para>
  903. </remarks>
  904. </member>
  905. <member name="T:AForge.Vision.Motion.SimpleBackgroundModelingDetector">
  906. <summary>
  907. Motion detector based on simple background modeling.
  908. </summary>
  909. <remarks><para>The class implements motion detection algorithm, which is based on
  910. difference of current video frame with modeled background frame.
  911. The <see cref="P:AForge.Vision.Motion.SimpleBackgroundModelingDetector.MotionFrame">difference frame</see> is thresholded and the
  912. <see cref="P:AForge.Vision.Motion.SimpleBackgroundModelingDetector.MotionLevel">amount of difference pixels</see> is calculated.
  913. To suppress stand-alone noisy pixels, an erosion morphological operator may be applied, which
  914. is controlled by <see cref="P:AForge.Vision.Motion.SimpleBackgroundModelingDetector.SuppressNoise"/> property.</para>
  915. <para><note>If precise borders of motion areas are required (for example,
  916. for further motion post-processing), then the <see cref="P:AForge.Vision.Motion.SimpleBackgroundModelingDetector.KeepObjectsEdges"/> property
  917. may be used to restore borders after noise suppression.</note></para>
  918. <para>As the first approximation of background frame, the first frame of video stream is taken.
  919. During further video processing the background frame is constantly updated, so it
  920. changes in the direction to decrease difference with current video frame (the background
  921. frame is moved towards the current frame). See the <see cref="P:AForge.Vision.Motion.SimpleBackgroundModelingDetector.FramesPerBackgroundUpdate"/>
  922. and <see cref="P:AForge.Vision.Motion.SimpleBackgroundModelingDetector.MillisecondsPerBackgroundUpdate"/> properties, which control the rate of
  923. background frame update.</para>
  924. <para>Unlike the <see cref="T:AForge.Vision.Motion.TwoFramesDifferenceDetector"/> motion detection algorithm, this algorithm
  925. allows identifying quite clearly all objects which are not part of the background (scene) -
  926. most likely moving objects. And unlike <see cref="T:AForge.Vision.Motion.CustomFrameDifferenceDetector"/> motion
  927. detection algorithm, this algorithm includes background adaptation feature, which allows it
  928. to update its modeled background frame in order to take scene changes into account.</para>
  929. <para><note>Because of its adaptation feature, the algorithm may adapt
  930. to background changes, which the <see cref="T:AForge.Vision.Motion.CustomFrameDifferenceDetector"/> algorithm cannot do.
  931. However, if a moving object stays on the scene for a while (so the algorithm adapts to it and does
  932. not treat it as a new moving object any more) and then starts to move again, the algorithm may
  933. find two moving objects - the true one, which is really moving, and a false one, which is not (the
  934. place where the object stayed for a while).</note></para>
  935. <para><note>The algorithm is not applicable to cases when a moving object resides
  936. in the camera's view most of the time (a laptop's camera monitoring a person sitting in front of it,
  937. for example). The algorithm is mostly intended for cases when the camera monitors some sort
  938. of static scene, where moving objects appear from time to time - street, road, corridor, etc.
  939. </note></para>
<para>Sample usage:</para>
<code>
// create motion detector
MotionDetector detector = new MotionDetector(
    new SimpleBackgroundModelingDetector( ),
    new MotionAreaHighlighting( ) );
// continuously feed video frames to motion detector
while ( ... )
{
    // process new video frame and check motion level
    if ( detector.ProcessFrame( videoFrame ) &gt; 0.02 )
    {
        // ring alarm or do something else
    }
}
</code>
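<para>The algorithm may also be used directly, without the <see cref="T:AForge.Vision.Motion.MotionDetector"/> wrapper. The sketch below assumes that <b>videoFrame</b> is an AForge.Imaging.UnmanagedImage obtained from a video source; the 0.02 alarm threshold is an arbitrary illustration value:</para>
<code>
// create the detection algorithm itself, with noise suppression enabled
SimpleBackgroundModelingDetector detector = new SimpleBackgroundModelingDetector( true );
// process a frame and check how much of it differs from the modeled background
detector.ProcessFrame( videoFrame );
if ( detector.MotionLevel &gt; 0.02 )
{
    // ring alarm or do something else
}
</code>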
</remarks>
<seealso cref="T:AForge.Vision.Motion.MotionDetector"/>
</member>
<member name="M:AForge.Vision.Motion.SimpleBackgroundModelingDetector.#ctor">
<summary>
Initializes a new instance of the <see cref="T:AForge.Vision.Motion.SimpleBackgroundModelingDetector"/> class.
</summary>
</member>
<member name="M:AForge.Vision.Motion.SimpleBackgroundModelingDetector.#ctor(System.Boolean)">
<summary>
Initializes a new instance of the <see cref="T:AForge.Vision.Motion.SimpleBackgroundModelingDetector"/> class.
</summary>
<param name="suppressNoise">Suppress noise in video frames or not (see the <see cref="P:AForge.Vision.Motion.SimpleBackgroundModelingDetector.SuppressNoise"/> property).</param>
</member>
<member name="M:AForge.Vision.Motion.SimpleBackgroundModelingDetector.#ctor(System.Boolean,System.Boolean)">
<summary>
Initializes a new instance of the <see cref="T:AForge.Vision.Motion.SimpleBackgroundModelingDetector"/> class.
</summary>
<param name="suppressNoise">Suppress noise in video frames or not (see the <see cref="P:AForge.Vision.Motion.SimpleBackgroundModelingDetector.SuppressNoise"/> property).</param>
<param name="keepObjectEdges">Restore objects' edges after noise suppression or not (see the <see cref="P:AForge.Vision.Motion.SimpleBackgroundModelingDetector.KeepObjectsEdges"/> property).</param>
</member>
<member name="M:AForge.Vision.Motion.SimpleBackgroundModelingDetector.ProcessFrame(AForge.Imaging.UnmanagedImage)">
<summary>
Process new video frame.
</summary>
<param name="videoFrame">Video frame to process (detect motion in).</param>
<remarks><para>Processes a new frame from the video source and detects motion in it.</para>
<para>Check the <see cref="P:AForge.Vision.Motion.SimpleBackgroundModelingDetector.MotionLevel"/> property to get information about the amount of motion
(changes) in the processed frame.</para>
</remarks>
</member>
<member name="M:AForge.Vision.Motion.SimpleBackgroundModelingDetector.Reset">
<summary>
Reset motion detector to initial state.
</summary>
<remarks><para>Resets the internal state and variables of the motion detection algorithm.
Usually this is required before processing a new video source, but it may
also be done at any time to restart the motion detection algorithm.</para>
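<para>A minimal sketch of restarting detection when switching to a different video source (the <b>detector</b> instance is assumed to have been created as shown in the class documentation):</para>
<code>
// about to start processing frames from another camera,
// so drop the modeled background and all other internal state
detector.Reset( );
</code>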
</remarks>
</member>
<member name="P:AForge.Vision.Motion.SimpleBackgroundModelingDetector.DifferenceThreshold">
<summary>
Difference threshold value, [1, 255].
</summary>
<remarks><para>The value specifies the minimum difference between a pixel of the current frame and the
corresponding pixel of the modeled background frame for the pixel to be treated as a motion pixel.</para>
<para>Default value is set to <b>15</b>.</para>
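<para>For example, a minimal sketch of making the detector less sensitive to small intensity changes (the value 25 is an arbitrary illustration, not a recommended setting):</para>
<code>
SimpleBackgroundModelingDetector detector = new SimpleBackgroundModelingDetector( );
// require a larger per-pixel difference before a pixel is counted as motion
detector.DifferenceThreshold = 25;
</code>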
</remarks>
</member>
<member name="P:AForge.Vision.Motion.SimpleBackgroundModelingDetector.MotionLevel">
<summary>
Motion level value, [0, 1].
</summary>
<remarks><para>Amount of changes in the last processed frame. For example, if the value of
this property equals 0.1, it means that the last processed frame has a 10% difference
with the modeled background frame.</para>
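<para>A minimal sketch of reporting the motion level as a percentage after a frame has been processed (the console output is just an illustration; <b>detector</b> and <b>videoFrame</b> are assumed context):</para>
<code>
detector.ProcessFrame( videoFrame );
// e.g. prints "Motion level: 10 %" when 10% of pixels differ from the background
Console.WriteLine( "Motion level: {0} %", (int) ( detector.MotionLevel * 100 ) );
</code>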
</remarks>
</member>
<member name="P:AForge.Vision.Motion.SimpleBackgroundModelingDetector.MotionFrame">
<summary>
Motion frame containing detected areas of motion.
</summary>
<remarks><para>The motion frame is a grayscale image, which shows areas of detected motion.
All black pixels in the motion frame correspond to areas where no motion was
detected, while white pixels correspond to areas where motion was detected.</para>
<para><note>The property is still set to <see langword="null"/> after processing of the first
video frame, since that frame is used as the initial background approximation; a motion frame becomes available starting from the second processed frame.</note></para>
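<para>A minimal sketch of accessing the motion frame once it is available (the null check reflects the note above; converting to a managed Bitmap via UnmanagedImage.ToManagedImage is just one possible way to inspect it):</para>
<code>
detector.ProcessFrame( videoFrame );
if ( detector.MotionFrame != null )
{
    // white pixels mark detected motion; convert to a Bitmap for inspection or saving
    System.Drawing.Bitmap motionImage = detector.MotionFrame.ToManagedImage( );
}
</code>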
</remarks>
</member>
<member name="P:AForge.Vision.Motion.SimpleBackgroundModelingDetector.SuppressNoise">
<summary>
Suppress noise in video frames or not.
</summary>
<remarks><para>The value specifies if additional filtering should be
done to suppress stand-alone noisy pixels by applying a 3x3 erosion image processing
filter. See the <see cref="P:AForge.Vision.Motion.SimpleBackgroundModelingDetector.KeepObjectsEdges"/> property if it is required to restore
edges of objects which are not noise.</para>
<para>Default value is set to <see langword="true"/>.</para>
<para><note>Turning this option on increases the processing time of each video frame.</note></para>
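<para>A minimal sketch of turning noise suppression off to save processing time (a trade-off between speed and robustness, not a recommendation):</para>
<code>
// false = do not apply the 3x3 erosion noise suppression step
SimpleBackgroundModelingDetector detector = new SimpleBackgroundModelingDetector( false );
</code>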
</remarks>
</member>
<member name="P:AForge.Vision.Motion.SimpleBackgroundModelingDetector.KeepObjectsEdges">
<summary>
Restore objects' edges after noise suppression or not.
</summary>
<remarks><para>The value specifies if additional filtering should be done
to restore objects' edges after noise suppression by applying a 3x3 dilation
image processing filter.</para>
<para>Default value is set to <see langword="false"/>.</para>
<para><note>Turning this option on increases the processing time of each video frame.</note></para>
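<para>A minimal sketch of enabling both noise suppression and edge restoration, using the two-argument constructor documented above:</para>
<code>
// suppress noise with a 3x3 erosion, then restore objects' edges with a 3x3 dilation
SimpleBackgroundModelingDetector detector = new SimpleBackgroundModelingDetector( true, true );
</code>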
</remarks>
</member>
<member name="P:AForge.Vision.Motion.SimpleBackgroundModelingDetector.FramesPerBackgroundUpdate">
<summary>
Frames per background update, [1, 50].
</summary>
<remarks><para>The value controls the speed of modeled background adaptation to
scene changes. After each specified number of frames the background frame is updated
in the direction that decreases its difference with the current frame.</para>
<para>Default value is set to <b>2</b>.</para>
<para><note>The property has effect only if the <see cref="P:AForge.Vision.Motion.SimpleBackgroundModelingDetector.MillisecondsPerBackgroundUpdate"/>
property is set to <b>0</b>. Otherwise it has no effect and the background
update rate is managed according to the <see cref="P:AForge.Vision.Motion.SimpleBackgroundModelingDetector.MillisecondsPerBackgroundUpdate"/>
property settings.</note></para>
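<para>A minimal sketch of slowing down background adaptation (the value 10 is an arbitrary illustration within the [1, 50] range; <b>detector</b> is assumed to be an existing SimpleBackgroundModelingDetector instance):</para>
<code>
// keep the frame based update mode active by leaving the time based mode turned off
detector.MillisecondsPerBackgroundUpdate = 0;
// update the modeled background once per 10 processed frames instead of the default 2
detector.FramesPerBackgroundUpdate = 10;
</code>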
</remarks>
</member>
<member name="P:AForge.Vision.Motion.SimpleBackgroundModelingDetector.MillisecondsPerBackgroundUpdate">
<summary>
Milliseconds per background update, [0, 5000].
</summary>
<remarks><para>The value represents an alternative way of controlling the speed of modeled
background adaptation to scene changes. It sets the number of milliseconds which
should elapse between two consecutive video frames to result in a background update
of one intensity level. For example, if this value is set to 100 milliseconds and
350 milliseconds elapsed between the last two video frames, then the background
frame will be updated by 3 intensity levels in the direction that decreases its difference
with the current video frame (the remaining 50 milliseconds are added to the time difference
between the next two consecutive frames, so accuracy is preserved).</para>
<para>Unlike the background update controlled by the <see cref="P:AForge.Vision.Motion.SimpleBackgroundModelingDetector.FramesPerBackgroundUpdate"/>
property, the update guided by this property is not affected by changes
in frame rate. If, for some reason, a video source starts to introduce delays between
frames (the frame rate drops), the amount of background update still stays consistent.
When background update is controlled by this property, it is always possible to estimate the
amount of time required to change, for example, an absolutely black background (0 intensity
values) into an absolutely white background (255 intensity values). If the value of this
property is set to 100, then such an update takes approximately 25.5 seconds
regardless of frame rate.</para>
<para><note>Background update controlled by this property is slightly slower than
background update controlled by the <see cref="P:AForge.Vision.Motion.SimpleBackgroundModelingDetector.FramesPerBackgroundUpdate"/> property,
so it has a bit greater impact on performance.</note></para>
<para><note>If this property is set to 0, then the corresponding background update
method is not used (turned off), and background update guided by the
<see cref="P:AForge.Vision.Motion.SimpleBackgroundModelingDetector.FramesPerBackgroundUpdate"/> property is used instead.</note></para>
<para>Default value is set to <b>0</b>.</para>
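<para>A minimal sketch of switching to time based background adaptation, together with the arithmetic described above (255 intensity levels at 100 ms per level is 25500 ms, i.e. about 25.5 seconds for a full black-to-white background change):</para>
<code>
// one intensity level of background update per 100 ms of elapsed time;
// a complete change from a 0 intensity background to a 255 intensity background
// then takes about 255 * 100 ms = 25.5 s, regardless of how many frames arrive
detector.MillisecondsPerBackgroundUpdate = 100;
</code>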
</remarks>
</member>
</members>
</doc>