Shape the future of FME with your ideas
Open ideas have been reviewed by our Product Management and are open for commenting and voting.
PDF and Word documents can contain form fields for users to fill in. I'm looking for the ability to auto-populate some or all of these fields from a database record. Ideally, the writer would accept a template .docx or .pdf and allow fields from the database record to be written to the corresponding form fields.
I can't see an obvious way to do this in FME 2024.2, but it would be nice to be able to set the "Out Fields" parameter on Esri service readers. Often I only need a handful of fields, or even just one, from feature classes with enormous schemas. It would be nice if that could be configured in the reader itself, seeing as the REST endpoint handles this parameter. Something like this as an example:
Currently, we can only set the default layer color in the DWG Writer using a color index. Various CAD standards also work with layer colors based on RGB values. I would like to see this expanded so that it can be set using both a color index and RGB values. This would avoid having to create templates with thousands of layers manually (for an extensive CAD standard).
The ISO-8601 standard for week numbers is missing from the DateTimeConverter: the week 01–53 implementation, where week 01 is the week containing the first Thursday of the year. This is referred to as %V in other scripting languages. Could this be added as a DateTime function?
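For reference, the ISO-8601 week-numbering rule this request describes is what Python's strftime exposes as %V (and what datetime.isocalendar() returns) — a minimal illustration of the desired behavior:

```python
from datetime import date

# ISO-8601: week 01 is the week containing the year's first Thursday,
# so early-January dates can belong to week 52/53 of the previous year.
print(date(2021, 1, 1).strftime("%V"))  # Friday 2021-01-01 falls in week 53 (of 2020)
print(date(2021, 1, 4).strftime("%V"))  # Monday 2021-01-04 is in week 01
print(date(2021, 1, 1).isocalendar())   # ISO year, week, weekday: 2020, week 53, day 5
```

Note that %V is platform-dependent in Python (it delegates to the C library), while isocalendar() is always available; either convention would be a useful model for a DateTime function.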
I believe this was mentioned in a webinar, but being able to import an endpoint’s schema - from the dataset it is going to be accessing of course - without having to manually create every property would be a great thing to have. This could be from a local file, cloud source, anything. Beyond saving time, it should also ensure accuracy.
Hi FME Server Team,

I would like to request an enhancement to the FME Server REST API.

API Endpoint: /transformations/jobs/completed

Enhancement Request: Currently, it is not possible to filter completed jobs by their finish time using this endpoint. I would like to request support for querying completed jobs by finishTime, in addition to filtering by repository and workspace.

Use Case: For monitoring and automation purposes, I need to programmatically retrieve jobs that have successfully finished within the past 5 minutes, filtered by a specific repository and workspace name.

Example Query:
- repository: <repository_name>
- workspace: <workspace_name>
- completed successfully (completedState: success)
- finished within the past 5 minutes (finishTime >= <timestamp>)

Having the ability to query by finishTime (ideally with support for both a start and end range, or a relative time based on the current time, e.g. the last 5 minutes) would greatly streamline our integration workflows.

Thank you for considering this request.
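The "last 5 minutes" filter could be expressed as an absolute timestamp computed client-side. A sketch of what the proposed call might look like — the finishTime and completedState query parameters here are hypothetical (this is the requested enhancement, not the current API), and the hostname is a placeholder:

```python
from datetime import datetime, timedelta, timezone
from urllib.parse import urlencode

# Compute the start of the relative window client-side.
since = datetime.now(timezone.utc) - timedelta(minutes=5)

# Hypothetical query parameters for the requested enhancement; only
# repository/workspace filtering exists on this endpoint today.
params = {
    "repository": "MyRepo",
    "workspace": "MyWorkspace.fmw",
    "completedState": "success",
    "finishTime": since.strftime("%Y-%m-%dT%H:%M:%SZ"),  # start of range
}
url = ("https://fmeflow.example.com/fmerest/v3"
       "/transformations/jobs/completed?" + urlencode(params))
print(url)
```

A monitoring script would then poll this URL on a schedule, with no need to page through the full job history and filter timestamps client-side.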
Currently, FME Workbench allows users to define parameters in custom transformers or published workflows, but the ability to provide detailed guidance for each parameter is limited. I propose adding a new feature to User Parameters that enables authors to attach rich descriptions to each parameter. This would include:
- Formatted text (Markdown or HTML)
- Web links to documentation, tutorials, or external tools
- Images or diagrams to illustrate usage

When a user runs the workspace, they would see an information icon next to each parameter (shown only if it has content). Clicking this icon would open a dedicated window or pane displaying the enhanced description, helping users understand the context, expected input, and any dependencies or external resources required.

Benefits:
- Improves usability and user experience, especially for complex parameters.
- Reduces support requests by providing self-service guidance.
- Enables better documentation and onboarding for new users.
FME Flow currently has columns in the Jobs Completed screen such as workspace, source, date/time, engine, etc.

For bulk dynamic data replication you need to run the same fmw multiple times with differing parameters, e.g. a SQL Server to SHP Downloader that runs on a single table at a time. While you could run the workspace only once (reading hundreds of thousands of features), that not only adversely and unnecessarily affects server resources but also means a single error affects all data. Instead, it's best to run the workspace multiple times (once per SQL Server table) via, say, an FMEFlowJobSubmitter.

In FME Flow this means the Jobs screen fills with hundreds of entries for the same workspace, differing only by date. There's no way to tell at a glance which parameters caused the 10 out of 200 jobs of the same workspace to fail.

I'd like to see a parameter exposed in the Jobs screen so that I can see at a glance that Table ABC and XYZ failed, not that Workspace123 failed 10 times with unknown parameters. Yes, I'm aware you can click into each job manually, and that we could set up a log-reader workspace or a post-processing task to do this; I have set such things up before. But it would be nice to have this out of the box in the Jobs screen. This could be a standard parameter or an FME Flow parameter that can be set by the author (e.g. linked to a Published Parameter).

Thanks.
The NoFeaturesTester custom transformer currently on FME Hub is powerful. However, we're getting pushback from IT about using it in our production workspaces on FME Server. We've benchmarked their suggested work-arounds, and they slow the workspace down significantly (2 seconds with NoFeaturesTester vs. 2.5 minutes with the alternatives). My idea: harden this transformer and turn it into a standard transformer.
This enhancement would allow full automation of attachment backup workflows using Esri ArcGIS feature services, or geodatabase attachments being backed up to ArcGIS feature services. Currently, even with the Esri ArcGIS package, you need to manually configure the feature service feature URL on a FeatureReader so that the features' featureID can be passed to the ArcGISAttachment connector. Implementation would be similar to the Publish Action item of the ArcGISOnlineConnector, which exposes the _webservice_url.
Hi there,

In this article (Configure user attribute mapping with Azure AD SAML Provider – FME Support Center), a group claim can be set up to pass an AD group name that aligns to a role in FME Flow. Since a group claim can be based on a search criterion, many AD groups could be returned; this is a common method of enterprise group membership in ArcGIS Enterprise software. Could this be used to grant several roles to a user?

I'm looking at controlling access to repositories, who can view a workspace, who can run a job, etc. via AD group assignment.

Thanks
If you have ArcGIS Pro installed on your machine then you have access to the arcpy.geocoding module with its Locator class, which can be serverless. This puts geocoding in the hands of anyone with (say) a file-based locator.
Adding row numbers to the interface would help you avoid errors and omissions. For example, a row number makes it easy to check how many conditions a Tester has. In addition, the row number could be highlighted when an attribute name already exists.
When I have a job that runs for 5-15 minutes (and this is with a subset of data), it would be nice to be able to sneak a look at the cached features at any point while the workspace is running. I could click on the green loading box and see what features have loaded so far. That way I can be QC-ing the output before the workspace has even finished running.
Allow using Shift or Ctrl (or any key combination with left click-drag) to temporarily turn off auto-snap to transformer ports, connection lines, etc. while moving one or more transformers into an area where one or more connection lines sit right where you want to place the new transformer.
Please add Import from Feature Cache (similar to the AttributeFilter), where you can populate the right side or the left side from values. It would also be great to have multi-row copy/paste (or drag and drop) to be able to easily adjust the mappings within the mapping editor.
In the current situation, with Finland and Sweden joining NATO, there is more demand to support the NATO Vector Graphics (NVG) format. I just read an RFP where it's a must, and I'm expecting more similar RFPs to come. It's not just military users who ask for it; it's also, e.g., border control and whoever is providing services for them.
Hi,

Right now it's impossible to dynamically push different headers to the HTTPCaller; you need to set all the header names before using it. It would be awesome to be able to push the headers you want to use as, say, a JSON document, just like the Body section.
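The behavior being requested — supplying header names and values as one JSON document at runtime rather than declaring each header up front — can be illustrated outside FME with Python's standard library; the JSON shape here is just one possible convention, not an existing HTTPCaller feature:

```python
import json
from urllib.request import Request

# A runtime-supplied JSON document mapping header names to values,
# analogous to how the HTTPCaller's Body can already be built dynamically.
headers_json = '{"Authorization": "Bearer abc123", "X-Request-Id": "42"}'
headers = json.loads(headers_json)

# The whole header set is applied in one go, without pre-declaring names.
req = Request("https://example.com/api", headers=headers)
print(req.get_header("Authorization"))  # Bearer abc123
```

In the HTTPCaller this might take the form of a "Headers (JSON)" parameter that accepts an attribute value, so upstream transformers could assemble the header set per feature.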
Currently there is no connection, transformer, or writer that connects FME directly to Azure Key Vault or other key vaults. This would be needed to access sensitive information such as client IDs and secrets for Web Connections (to Portal, ACC, etc.) stored in Key Vault, which we could then incorporate directly into our workbenches. As a company, security is becoming ever more important, and this connection would greatly help us manage sensitive data in line with corporate requirements. We recognise that the existing web connections have decent encryption in place; however, there is a general push for a more standardised approach to storing IDs/secrets at a company level.
Hi there,

If you click on the "Log out of FME account" menu item, you see this dialog. There is no way to cancel it. I think it would benefit from a Cancel button, in case you change your mind.

Thanks,
Marc