
abnan0001
Supporter

AI - Object Detection (Archived)

Hi FME Innovators,

I was wondering whether there are any plans to release FME transformers and tools for the AI era. Object detection is evolving quickly, and instead of relying on external API connectors, FME could offer its own API, transformers, and labelling tools (training a model within FME). I have kept a few notes on object detection from my own exploration, which I share below.

What is Object Detection?

Object detection is a computer vision method that detects and identifies objects in an image or video. While image classification predicts a single label for an entire image, object detection finds several objects in a single image, giving each of them a bounding box and a class label.

Object detection handles two main tasks:
- Localisation - where is the object?
- Classification - what is the object?

Traditional Machine Learning for Object Detection

Before the emergence of deep learning, object detection relied on handcrafted features and classical ML algorithms. These techniques require manual feature extraction and struggle with variation such as lighting changes, scale changes, and background changes.
- Haar Cascades: introduced by Viola and Jones (2001); used for early face detection (e.g. OpenCV's face detector); based on Haar-like features and a cascade of classifiers.
- Histogram of Oriented Gradients (HOG) + SVM: detects objects using gradient orientations; popularised by Dalal and Triggs for pedestrian detection; more compact and robust than Haar, but computationally expensive.
- Selective Search + SVM: generates region proposals, which are then classified; helped bridge the gap between traditional machine learning and deep learning.
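The traditional pipeline above (slide a window over the image, extract features, classify each window) can be sketched roughly as follows. This is a toy illustration only: the feature extractor and classifier are placeholders standing in for real HOG features and a trained SVM.

```python
# Sketch of a classic sliding-window detector: localisation by scanning
# windows, classification by scoring each window. The "features" and
# "classifier" below are toy placeholders standing in for HOG + SVM.

def extract_features(window):
    # Placeholder for HOG: here just the mean intensity of the window.
    flat = [px for row in window for px in row]
    return sum(flat) / len(flat)

def classify(feature, threshold=0.5):
    # Placeholder for a trained SVM: score above threshold => "object".
    return feature > threshold

def sliding_window_detect(image, win=2, stride=1):
    """Return (x, y, w, h) boxes whose window classifies as an object."""
    h, w = len(image), len(image[0])
    boxes = []
    for y in range(0, h - win + 1, stride):
        for x in range(0, w - win + 1, stride):
            window = [row[x:x + win] for row in image[y:y + win]]
            if classify(extract_features(window)):
                boxes.append((x, y, win, win))
    return boxes

# A tiny 4x4 "image" with a bright 2x2 patch in the top-left corner.
img = [
    [1.0, 1.0, 0.0, 0.0],
    [1.0, 1.0, 0.0, 0.0],
    [0.0, 0.0, 0.0, 0.0],
    [0.0, 0.0, 0.0, 0.0],
]
print(sliding_window_detect(img))  # only the (0, 0) window is detected
```

The weaknesses noted above show up directly in this shape: every window is processed independently, and the hand-picked features must cope with all lighting and scale variation themselves.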
While these machine learning methods laid the groundwork, they could not match the accuracy and scale of modern deep learning models.

Deep Learning for Object Detection

Deep learning has transformed object detection by automating feature extraction via Convolutional Neural Networks (CNNs). These models learn progressively abstract features from the data, improving both speed and accuracy.

Two-Stage Detectors

Two-stage detectors separate region proposal from classification.
- R-CNN (Regions with CNN Features): uses Selective Search to generate region proposals; runs a CNN on each proposed region to extract features and classify it; very accurate, but slow (each region is processed independently).
- Fast R-CNN: shares convolutional computation across the whole image; adds an ROI pooling layer to extract per-region features from the shared feature maps; faster than R-CNN, but still not real-time.
- Faster R-CNN: introduces a Region Proposal Network for end-to-end training and prediction; highly accurate and close to real-time performance.

Single-Stage Detectors

Single-stage detectors eliminate the region proposal step and predict bounding boxes and class labels directly.
- YOLO (You Only Look Once): targeted at real-time detection; divides the image into a grid and predicts bounding boxes for each grid cell; versions have progressed from YOLOv1 through YOLOv3, YOLOv4, and YOLOv5 to YOLOv8 (the most recent versions leverage Transformer-based modifications).
- SSD (Single Shot MultiBox Detector): performs detection on feature maps from multiple convolutional layers; offers a good trade-off between speed and accuracy.
- RetinaNet: introduced Focal Loss, a re-weighted loss, to address the issue of class imbalance during training.
RetinaNet shows good results across a range of benchmarks.

Innovative Architectures and Trends (2025)

Modern architectures combine CNNs, Transformers, and self-supervised learning techniques for better generalisation.
- DETR (Detection Transformer): an end-to-end object detection pipeline built on Transformers; removes the need for anchor boxes and Non-Max Suppression (NMS); very accurate but less computationally efficient than YOLO.
- Vision Transformers (ViT): attention-based global feature extraction; often paired with a hybrid CNN backbone for efficiency.
- Self-supervised learning (SSL): models pretrained on unlabelled data (MAE, SimCLR) transfer better to tasks with limited labelled datasets.

Tools and Frameworks

Here are some popular frameworks for implementing object detection:
- TensorFlow Object Detection API
- PyTorch + TorchVision
- Ultralytics YOLOv8
- Detectron2 (by Meta AI)
- MMDetection

Thanks
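Since NMS comes up repeatedly above (single-stage detectors rely on it, while DETR is notable for removing it), here is a minimal pure-Python sketch of IoU-based non-max suppression. The corner-coordinate box format and the 0.5 threshold are illustrative choices, not any particular framework's defaults.

```python
# Minimal Non-Max Suppression (NMS): keep the highest-scoring box,
# drop any remaining box that overlaps it too much, repeat.
# Boxes are (x1, y1, x2, y2) corner coordinates.

def iou(a, b):
    """Intersection-over-Union of two boxes."""
    ix1, iy1 = max(a[0], b[0]), max(a[1], b[1])
    ix2, iy2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0, ix2 - ix1) * max(0, iy2 - iy1)
    area_a = (a[2] - a[0]) * (a[3] - a[1])
    area_b = (b[2] - b[0]) * (b[3] - b[1])
    return inter / (area_a + area_b - inter)

def nms(boxes, scores, iou_threshold=0.5):
    """Return indices of the boxes kept after suppression."""
    order = sorted(range(len(boxes)), key=lambda i: scores[i], reverse=True)
    keep = []
    while order:
        best = order.pop(0)
        keep.append(best)
        order = [i for i in order if iou(boxes[best], boxes[i]) < iou_threshold]
    return keep

# Two near-duplicate detections of one object, plus a distinct one.
boxes = [(0, 0, 10, 10), (1, 1, 11, 11), (50, 50, 60, 60)]
scores = [0.9, 0.8, 0.7]
print(nms(boxes, scores))  # keeps indices 0 and 2, suppresses 1
```

This greedy post-processing step is exactly what DETR's set-based prediction makes unnecessary.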

j.botterill
Influencer

SDE file connections as Database Connections not sharable across FME Flow Connection Store (New)

Esri only makes ArcSDE connections available via an ".sde file", a proprietary file stored on disk - sometimes on the C drive, other times on network storage. We often struggle with access to these files: some are open with read-only access, while others have higher privileges to write to the SDE geodatabase. So it's confusing to me to have *.sde files treated as "Database Connections".

In 2025, we now have the option to store database connections in FME Flow. We can change the dataset path from local to the Shared Resources on FME Flow, either the Engine or the Data folder.

Flow database connection with SDE type

This enables further re-use of the connection within Flow... however, the problem then becomes: how can FME authors manage the connection in Form?

FME Flow connection storage is great, but not necessarily for SDE file database connections. In practice, when you go to re-use the Flow connection for SDE, despite the path pointing to Resources (Engine, shared to roles), the error is repeatable.

Errors connecting to feature types in an ArcSDE geodatabase, encountered in 2025.1

The workaround for now is to follow Option 2 in the article https://support.safe.com/hc/en-us/articles/30212601575693-How-to-Create-and-Manage-Esri-Geodatabase-ArcSDE-Connections-in-FME: storing a single SDE file in a network share location that is accessible to both Form and Flow.

Can Safe Software please add an enhancement to help find a better solution for SDE connections in FME? Maybe Esri, or the community, wishes to move away from *.sde files as the single means of connecting to the Spatial Database Engine. What's needed is another means/protocol to properly "direct connect" to the DBMS, including the SDE registry.

Connect to the DBMS registered for SDE, and add a new option to get a license and work as SDE. A requirement for an FME Flow instance is to have ArcGIS Server installed for the licensing of SDE/FGDB. Perhaps this opens up something new.

andrew_r
Contributor

More options for handling date values on the new Esri ArcGIS Feature Service Reader & Writer (Gathering Interest)

Esri has been adding more options for handling dates. In geodatabases, they now offer new data types for "Date Only", "Time Only", and "Datetime with timezone offset". On feature services in ArcGIS Online and ArcGIS Enterprise, publishers can now define the timezone of the data underlying the service, and the timezone for display to clients. This allows service publishers to define how they prefer dates from the service to be displayed, and how dates being written to the service should be translated for storage in the underlying dataset. However, from my rough testing, the REST services are still sending dates as UNIX values, so the timezone definition on the service is just there so that clients (like ArcGIS Pro, ArcGIS Enterprise, etc.) know what to do with the UNIX values on the client side before displaying or writing back. Plus, from what I saw at the 2025 user conference, they are adding more datetime configuration options to Pro in future releases.

It would be helpful if the new feature service reader could tap into these settings and control how data from date attributes gets pulled into the workspace initially. It's one less thing to have to translate as data comes into the workspace, when the creator of the service has already defined how they would prefer users to interact with the dates in that service.

On the reader, I could see this as a "DateTime Output Format" parameter with options like:
- Unix: values from date columns are brought in as the UNIX values from Esri.
- FME UTC: values from date columns are brought in in FME datetime format, with timezones not translated.
- FME with Timezone from Service: values from date columns are brought in in FME datetime format, but with the timezone added.
On the writer, I could see a parameter to control how FME date attributes are translated on writing to the service, so they honor the "Time zone of the data" setting.

I personally have just started migrating from the old ArcGIS Portal reader to the new ArcGIS Feature Service reader. With the old reader, date info was automatically translated into FME datetime format. With the new reader, date columns initially load in UNIX format. That means I now have to do a datetime conversion on data from any reader if I want to work with it in FME Workbench. But maybe I'm missing something about how best to work with this kind of data in the workbench. I haven't found anything yet about this change in the documentation or blog articles; I only found out about it after submitting a ticket. So if anyone has more detail on whether the change is intentional, or how to deal with it now that it's in the new version, please let me know.
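In the meantime, the conversion described above can be done manually, for example in a PythonCaller. The sketch below is an illustration of the idea, not Safe-provided code: it assumes the Esri REST value is milliseconds since the UNIX epoch in UTC, and that a plain YYYYMMDDHHMMSS string is an acceptable FME datetime representation; the function name and offset parameter are made up for this example.

```python
# Convert an Esri REST UNIX timestamp (assumed: milliseconds since the
# epoch, UTC) into an FME-style datetime string (YYYYMMDDHHMMSS),
# optionally shifted into the service's declared timezone.
from datetime import datetime, timedelta, timezone

def esri_unix_to_fme(ms, utc_offset_hours=0):
    """ms: Esri epoch milliseconds; utc_offset_hours: service timezone."""
    tz = timezone(timedelta(hours=utc_offset_hours))
    dt = datetime.fromtimestamp(ms / 1000.0, tz=tz)
    return dt.strftime("%Y%m%d%H%M%S")

# 2025-07-01 12:00:00 UTC as epoch milliseconds:
ms = 1751371200000
print(esri_unix_to_fme(ms))      # UTC -> 20250701120000
print(esri_unix_to_fme(ms, -7))  # shifted to UTC-7 -> 20250701050000
```

A built-in reader parameter would of course be preferable, since this per-attribute conversion has to be repeated in every workspace that touches the service.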

joellejansen
Contributor

Feature Request: Support for Azure DevOps in FME Flow Version Control (New)

We would like to request support for Azure DevOps as a remote Git provider in the Version Control functionality of FME Flow.

One of our clients recently upgraded from FME Flow 2024.2.1 to FME Flow 2025.1.2. In the previous version, they were successfully using Azure DevOps Git repositories to manage workspace versioning. After the upgrade, they are no longer able to push changes to their remote repository. The UI reports: "There was a problem communicating with the REST API." and the backend logs show HTTP 500 errors when attempting to push.

According to the documentation, only GitHub.com is officially supported. Azure DevOps is not listed, although it previously worked without issue. This limitation significantly impacts their ability to maintain version history and collaborate effectively.

Could you please consider:
- Adding official support for Azure DevOps Git repositories in FME Flow Version Control.
- Providing documentation or configuration guidance for Azure DevOps integration.
- Ensuring compatibility with common enterprise Git platforms beyond GitHub.com.

This feature would be highly valuable for organizations using the Microsoft ecosystem and would align FME Flow with broader enterprise DevOps practices. Please let us know if this request will be considered for a future release and, if so, in which upcoming release.

Thank you for your support!

Kind regards,
Joëlle Jansen-Soepenberg
FME Consultant