I have been mulling over this for many years: user-interface support for Uralic and Siberian languages. Ainu of Japan is only supported by typing romanized input that is rendered into Katakana with a few small modified characters; there is no user interface, spell checker, grammar checker, dictionary, or translator. Of course Ainu has few terms for modern vocabulary, but I am studying the language in order to find words and coin new ones, for example iPhone = hoomi-ye-p, "electric speak thing". I am looking for other people who have the same idea.
I am working on an app that pulls data from WeatherKit, including the conditionCode property, whose content is displayed to the user. I want to localize the data pulled from WeatherKit, but when requesting data from:
weatherkit.apple.com/api/v1/weather/de/{latitude}/{longitude}
the conditionCode and other strings are in English. The same is true if the language parameter is set to es, ja, or anything else.
Am I doing something wrong, or is localization yet to be supported in WeatherKit? I can't find any documentation on this.
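In the meantime I'm handling it client-side. A minimal sketch of what I mean, assuming the REST API's conditionCode values (e.g. "Clear") can be used as keys into a strings table I maintain myself ("WeatherConditions" is my own, hypothetical table, not a WeatherKit API):

import Foundation

// Workaround sketch: translate WeatherKit's English conditionCode values
// locally. "WeatherConditions" is a strings table I maintain, with one
// entry per code the API can return.
func localizedCondition(_ conditionCode: String) -> String {
    String(localized: String.LocalizationValue(conditionCode),
           table: "WeatherConditions")
}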
Our users are using Apple's native Voice Control feature: https://support.apple.com/en-us/HT210417
We want to enhance our accessibility experience by adding some additional voice-controlled dialogs that show up specifically when Voice Control is enabled.
Whether other Apple accessibility features are turned on can be determined via checks like UIAccessibility.isVoiceOverRunning; however, there is no such option for Voice Control (note: this is different from VoiceOver).
How can I detect if a user is running Voice Control or not?
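For reference, these are the status checks I know of in UIKit, and none of them covers Voice Control, which is exactly the gap:

import UIKit

// Documented UIAccessibility status checks; as far as I can tell there
// is no isVoiceControlRunning counterpart in the public SDK.
let statuses: [String: Bool] = [
    "VoiceOver": UIAccessibility.isVoiceOverRunning,
    "Switch Control": UIAccessibility.isSwitchControlRunning,
    "Speak Screen": UIAccessibility.isSpeakScreenEnabled,
]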
When Full Keyboard access is enabled, the currently focused element is indicated by a thick border (first screenshot below). If the focused element is inside a focus group, e.g. a UIScrollView, then the thick border encloses the entire focus group, and the focused element is indicated by a change in background color instead (second screenshot below). These two types of focus state seem to use the tintColor of the element.
We were advised that the change in background color does not meet WCAG standards since the contrast ratio between the non-focused state and the light blue focused state is not high enough.
Apart from changing the tintColor, is there any other way to customize the focused appearance of an element? It would be ideal if we could apply a border to the focused element even when it's contained in a focus group, rather than just changing the background color.
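One avenue we're experimenting with is the focus-effect API from iOS 15. A sketch, under the assumption (not yet verified on our side) that it also applies to Full Keyboard Access focus:

import UIKit

// Supply a custom halo instead of relying on tintColor alone.
let button = UIButton(type: .system)
button.focusEffect = UIFocusHaloEffect(
    roundedRect: button.bounds,
    cornerRadius: 8,
    curve: .continuous
)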
I'm writing a program that requires accessibility permissions, and I use AXIsProcessTrustedWithOptions to check for them and direct the user to the Accessibility pane for authorization. In the conventional flow, once the Accessibility pane is opened, the system automatically adds the corresponding program and the user only needs to turn the permission on. But now, with the Accessibility pane open, I don't see my program being added automatically.
Next I tried to add it manually: click + in the Accessibility pane, select the program I built, and add it. But after clicking Add, my program was not successfully added; the Accessibility pane still did not list it, and the user could not grant it permission.
I don't think this is related to it being a development build, because other debug programs developed and compiled with Xcode can be added normally. So I want to know: what are the possible reasons it cannot be added, and which parts should I check?
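For reference, this is how I trigger the prompt (the conventional flow I described):

import ApplicationServices

// Ask the system to prompt the user and (normally) list this app in
// the Accessibility pane if it is not yet trusted.
let options = [kAXTrustedCheckOptionPrompt.takeUnretainedValue() as String: true] as CFDictionary
let trusted = AXIsProcessTrustedWithOptions(options)
print("trusted: \(trusted)")

One thing I've seen suggested, though I can't confirm it, is that the system identifies apps by their code signature, so an ad hoc or inconsistently signed build may fail to register; the signing identity would be the first thing I'd check.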
Hi everyone, do you know when real-time subtitles for Italian-language calls will be implemented on iPhone?
We recently updated to Xcode 15.1, started using the iOS 17.2 simulator, and ran into a blocking issue with our UI tests. Setting an accessibility identifier on a UIButton with an image no longer works. iOS seems to automatically set the label to the image's file name and ignores (overwrites) the identifier and value we set in code. iOS 17.0 with Xcode 15.1 still works. I spent two days on this and still cannot find a solution. Has anyone had a similar issue?
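For reference, a minimal sketch of our setup ("play_icon" is a placeholder asset name), including the explicit label we are now setting as a workaround:

import UIKit

let button = UIButton(type: .custom)
button.setImage(UIImage(named: "play_icon"), for: .normal)  // placeholder asset
button.accessibilityIdentifier = "playButton"
// Workaround under test: set the label explicitly so the system does
// not fall back to the image file name.
button.accessibilityLabel = "Play"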
I did file a feedback https://feedbackassistant.apple.com/feedback/13515676
Thanks!
Hello everyone! I'm currently experiencing an issue with the language of our website banners. Despite having translations set in Localizable Information for the titles in English, French, German, and Italian, the banners aren't adapting to the user's chosen language.
The title always displays in French, while the subtitle ("Open in the app") remains in English, irrespective of the user's language preference.
Does anyone know how I can make sure our banners translate dynamically according to the user's language choice? Is there a setting somewhere on the Apple product page that I'm missing? Thanks!
Hello,
We can export and save a great summary audit result as HTML using the Accessibility Inspector. Is there any way to get the same audit results from UI tests and integrate them into CI/CD, so that we can deliver a monthly audit report to our app designer?
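In case it helps, Xcode 15 added an audit API to XCTest. A sketch of how it might run in CI; note it reports issues as test failures inside the .xcresult bundle rather than producing the Inspector's HTML summary:

import XCTest

final class AccessibilityAuditTests: XCTestCase {
    func testAccessibilityAudit() throws {
        let app = XCUIApplication()
        app.launch()
        // Runs the built-in accessibility audit on the current screen;
        // each detected issue fails the test and lands in the .xcresult.
        try app.performAccessibilityAudit()
    }
}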
Hi, I'm currently registering notifications on numerous AXUIElementRefs. I would like to record a timestamp of when each event occurs; however, I cannot find a reliable way to do so.
Getting a timestamp when the callback is called isn't reliable because the order of callback execution is arbitrary. I know the run loop API is mostly open sourced, and this is a bit of a reach, but is it possible to hook into the CFRunLoopSourceSignal call from the AXObserverRef?
Somewhere in the Apple API stack these notifications are being triggered. My question is, do they record the timestamp and are there any public or private APIs to gather this information?
My goal is to reliably gather in what order certain events happen (e.g. window move, focus, etc.).
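For context, this is roughly what I do today (pid is a placeholder for the observed app). It stamps receipt time, not occurrence time, which is exactly why the ordering is unreliable:

import ApplicationServices

let pid: pid_t = 12345  // placeholder: PID of the observed application
var observer: AXObserver?

// The callback stamps the wall-clock time at delivery, i.e. after any
// queueing the system does between the event and my run loop.
let callback: AXObserverCallback = { _, _, notification, _ in
    print("\(notification) received at \(CFAbsoluteTimeGetCurrent())")
}

if AXObserverCreate(pid, callback, &observer) == .success, let observer {
    let appElement = AXUIElementCreateApplication(pid)
    AXObserverAddNotification(observer, appElement, kAXWindowMovedNotification as CFString, nil)
    CFRunLoopAddSource(CFRunLoopGetCurrent(), AXObserverGetRunLoopSource(observer), .defaultMode)
}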
I have now finished the app I am using to learn about translations for different localizations, and I have placed the strings in the String Catalog. I still have one question about localization: in my project Info I added the Localization item and gave it the value German, which is how I have told the App Store so far that the app language is German. Can I delete this localization item from the Info now? Or what does this entry do (see screenshot)?
Because of the different languages in the String Catalog, the project settings now list those languages under Localizations, each followed by "Resources 0 Files Localized". What is this all about?
I am using the default AVPlayerViewController with the default player controls and skip buttons for video streaming in a tvOS app. Custom controls/buttons are not being used.
Can we override the VoiceOver accessibility text/behavior of AVPlayerViewController's default player controls? I have not been able to find any Apple documentation on this and am not sure it is even possible.
I'm experiencing an issue with the Accessibility Inspector. It used to work fine.
Since the update, I'm unable to perform an audit while in the Simulator. I get the following message: "Select a target app to view Accessibility warnings and audit information." However, the VoiceOver and Dynamic Type functionalities are working fine.
The audit only works when I test on my iPhone. How can I connect the target for the audit?
Thank you.
Accessibility Inspector V5
Simulator 15.2
I'm using Xcode 15.2 and migrated my (macOS) project to use an xcstrings file a while back. Now when I check the xcstrings file, all items are marked as "stale". When I add new localized strings in code, they don't show up in the xcstrings file. The xcstrings file is built correctly (into .lproj/Localizable.strings) when building.
Where can I check which source files are checked to update xcstrings status? "xcstringstool" appears to have a "sync" feature which reads "stringsdata" files, but there is no information in the xcstringstool help on where the stringsdata files come from.
If I create a new project I can see a "stringsdata" file being generated for each source file in the intermediate build products folder.
How do I switch off the JSON view of the project's string localization? I have no idea what switched it on in the first place, but I want to return to the comfortable String Catalog view.
I want to trigger a "pinch", i.e. select whatever it is I'm looking at, by pressing the spacebar on a Bluetooth keyboard paired with the Apple Vision Pro.
Is this feasible in Xcode or Unity (visionOS development), where I not only have access to native keyboard presses but can also see the gaze position (or what it's currently highlighting/focusing on) without going through an interaction event?
If keyboard press, then select the item being gazed at (see the sketch below).
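To make that pseudocode concrete, here is the closest SwiftUI sketch I can come up with. As far as I know raw gaze position isn't exposed to apps, so this leans on the focus system plus the onKeyPress modifier (visionOS 1 / iOS 17):

import SwiftUI

// A focusable control that activates on a hardware spacebar press.
// Gaze itself is not readable; focus is the nearest available proxy.
struct SelectOnSpace: View {
    var body: some View {
        Button("Select") {
            print("selected")
        }
        .onKeyPress(.space) {
            print("spacebar pressed, treat as pinch/select")
            return .handled
        }
    }
}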
We won't have "pinch" available to trigger an interaction event for people with hand/arm impairments, and there is an array of accessibility devices that send keyboard commands which could aid in this.
Apple should have this ability in its native settings, specifically with keyboard commands as well. This app would demonstrate that ability before it is native, and any help would be greatly appreciated :)
I've been working on migrating some graphics to Swift Charts for an app I work on. However, I've been noticing strange behavior when it comes to VoiceOver. If I create a bar chart and use:
BarMark(x: .value("Month", x, unit: .month), y: ...)
While the chart looks fine, the VoiceOver values seem to follow arbitrary values set for the bin. From what I can tell, they follow the underlying bin values that Swift Charts uses to provide spacing between bars.
For instance, this simple example:
import SwiftUI
import Charts

let monthlyRevenueData = [
    (x: try! Date("2024-01-01T00:00:00Z", strategy: .iso8601), y: (income: 55000, revenue: 124000), id: UUID()),
    (x: try! Date("2024-02-01T00:00:00Z", strategy: .iso8601), y: (income: 58000, revenue: 130000), id: UUID()),
    (x: try! Date("2024-03-01T00:00:00Z", strategy: .iso8601), y: (income: 59000, revenue: 120000), id: UUID()),
]

struct ContentView: View {
    var body: some View {
        Chart(monthlyRevenueData, id: \.id) { item in
            BarMark(x: .value("Month", item.x, unit: .month), y: .value("Income", item.y.income))
                .foregroundStyle(.green)
            BarMark(x: .value("Month", item.x, unit: .month), y: .value("Revenue", item.y.revenue))
        }
    }
}

#Preview {
    ContentView()
}
This results in VoiceOver reading "January 14th 2024 at 12 AM to January 28th 2024 at 12 AM ...", despite the fact that the data should cover the entire month of January.
Is there any way to get VoiceOver to read the input data rather than relying on how the chart is formatted? Preferably without the need to remove all visual spacing between the bars.
Video link: https://drive.google.com/file/d/11mxCl3wR2HzoOaihOvci-vZk4zgG1d39/view?usp=drive_link
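One workaround I've found is attaching explicit accessibility labels and values to each BarMark so VoiceOver reads the input data instead of the bin; the label and value wording below is my own, not something Swift Charts generates:

Chart(monthlyRevenueData, id: \.id) { item in
    BarMark(x: .value("Month", item.x, unit: .month), y: .value("Income", item.y.income))
        .foregroundStyle(.green)
        // Explicit label/value override what VoiceOver derives from
        // the chart's internal binning.
        .accessibilityLabel(Text(item.x, format: .dateTime.month(.wide).year()))
        .accessibilityValue(Text("Income \(item.y.income)"))
}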
Hi,
I'm parsing iOS localization files, and during tests with Xcode 15 I noticed that new lines appear in the xcloc files with LS (U+2028, the Unicode line separator) instead of the usual LF I was used to.
Questions:
Is this the default behavior in Xcode 15? Has this changed with this version?
Is this controllable by any setting?
Disclaimer: not an iOS developer here; please pardon any confusion and have patience.
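For anyone triaging the same thing, this is the quick check I use for U+2028 in an exported file (the path is a placeholder):

import Foundation

let path = "/tmp/export.xcloc/Localized Contents/de.xliff"  // placeholder path
let contents = try String(contentsOfFile: path, encoding: .utf8)
// LS is U+2028 LINE SEPARATOR; LF is the usual "\n".
print("contains LS:", contents.contains("\u{2028}"))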
Universal Control shares a keyboard and mouse from one Mac to another, but actually switching between them seems to require physically moving the mouse to the other display.
Ideally, I'd like Apple to support some command available via command-key equivalents to cycle to other devices in the way that we can cycle through applications or windows.
Seeking to program that directly, I was unable to find any Universal Control APIs on point. Are there any? I can imagine this being restricted to the OS only, for security.
In case there are display-driven APIs: I see that System Settings > Displays shows all the displays from all devices, but I was unable to find a UI or API to change focus to another display (other than moving the mouse to select it). When I list displays programmatically, I only see the device-local displays (see the snippet below).
In case there are device-driven APIs: I can initialize a Bluetooth session and secure credentials, but interaction seems to be unavailable.
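For what it's worth, this is how I enumerate displays, and it only ever returns the local ones:

import CoreGraphics

// Core Graphics enumeration returns only displays attached to this
// Mac, not displays belonging to Universal Control peers.
var displayCount: UInt32 = 0
CGGetActiveDisplayList(0, nil, &displayCount)
var displays = [CGDirectDisplayID](repeating: 0, count: Int(displayCount))
CGGetActiveDisplayList(displayCount, &displays, &displayCount)
print(displays)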
My app speaks and offers the user the ability to use their Personal Voice, but if the user says "No" when I first ask for permission to use their Personal Voice, I can never ask them again.
So if they change their mind later, they have to go to Settings > Accessibility > Personal Voice to toggle the permission for my app. To make things easier for them, I would like to be able to open that page for them, but I don't know how to create the URL for it. Is it even possible?
The closest I've been able to get is opening the settings page for my app (Settings > My App Settings) with:
guard let settingsUrl = URL(string: UIApplication.openSettingsURLString) else { return }
if UIApplication.shared.canOpenURL(settingsUrl) {
    UIApplication.shared.open(settingsUrl)
}
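For context, this is the request I make (iOS 17 API). Once the user has declined, the completion simply returns the stored status without re-prompting:

import AVFoundation

AVSpeechSynthesizer.requestPersonalVoiceAuthorization { status in
    // After a one-time "No", status comes back .denied on every call;
    // I know of no way to re-trigger the system prompt from the app.
    guard status == .authorized else { return }
    let personalVoices = AVSpeechSynthesisVoice.speechVoices()
        .filter { $0.voiceTraits.contains(.isPersonalVoice) }
    print(personalVoices)
}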
Thanks in advance!