Apple Inc. is delaying the rollout of tools aimed at combating child pornography on iPhones after sparking concern among privacy advocates that the software could create broader risks for users.

The Cupertino, Calif., tech giant said Friday it would take additional time to make improvements to the plan announced last month. It is the second time in a year that the company has delayed a new feature after an outcry from critics over its potential ramifications.

As part of the latest initiative, the company planned to roll out a system through an iPhone software update later this year that could identify known child-pornography images, then alert Apple if a certain number of those images were uploaded to the company’s cloud storage service known as iCloud.
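
As described, the mechanism amounts to hashing each image bound for iCloud, comparing those hashes against a database of known images, and acting only once the number of matches crosses a threshold. The Python sketch below illustrates that flow under stated assumptions; the hash function, the database contents and the threshold value are placeholders for illustration, not Apple's actual on-device matching system or its parameters.

```python
# A self-contained sketch of threshold-based matching against a database of
# known image hashes, as described above. The hash function, the database
# contents and the threshold value are illustrative assumptions, not Apple's
# actual matching system or parameters.
import hashlib
from typing import Iterable

# Stand-in for a database of hashes of known images (hypothetical values).
KNOWN_IMAGE_HASHES = {
    "placeholder-hash-1",
    "placeholder-hash-2",
}

# Illustrative only; the article does not specify the real threshold.
ALERT_THRESHOLD = 30


def image_hash(image_bytes: bytes) -> str:
    """Placeholder hash. A production system would use a perceptual hash that
    tolerates resizing and re-encoding, not a cryptographic digest."""
    return hashlib.sha256(image_bytes).hexdigest()


def count_known_matches(cloud_uploads: Iterable[bytes]) -> int:
    """Count uploaded images whose hash appears in the known-image database."""
    return sum(1 for img in cloud_uploads if image_hash(img) in KNOWN_IMAGE_HASHES)


def should_raise_alert(cloud_uploads: Iterable[bytes]) -> bool:
    """Only images headed to cloud storage are checked, and no alert is raised
    until the number of matches reaches the threshold."""
    return count_known_matches(cloud_uploads) >= ALERT_THRESHOLD
```

The threshold is the key design point in that description: a single match, on its own, would not be enough to generate an alert.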

Apple had vigorously defended the new program as privacy friendly, arguing that other cloud providers trying to combat exploitative images rely on technology that scans the entirety of a user’s data, while Apple had devised a way to look only at items flagged as being in violation. Apple’s system wouldn’t flag offending content if it wasn’t uploaded to the cloud.

Privacy experts and critics worried the feature signaled that the tech giant was softening its posture toward protecting data via encryption. The company has battled law enforcement and government officials for years over access to encrypted data on its devices.

“Based on feedback from customers, advocacy groups, researchers and others, we have decided to take additional time over the coming months to collect input and make improvements before releasing these critically important child safety features,” Apple said. The company didn’t say when it would roll out the updates.

Friday’s announcement is the latest of several policy retreats and setbacks for Apple. It has twice in recent weeks announced changes to its App Store rules for software developers in the face of legal and regulatory complaints.

And last year it delayed another feature that it said was aimed at protecting users’ privacy, involving changes to iPhone software intended to give people more say in how their data was being used by apps for advertising purposes.

That delay came after Facebook Inc. and others had vigorously complained that the updates would hinder an ad industry built on user data for personalized messages. Apple eventually released the changes earlier this year.

Alongside the child-pornography detection tool, Apple had announced another child-safety update in August aimed at helping children and parents spot sexually explicit photos sent in text messages. If an account designated as a child in iCloud Family Sharing received or prepared to send a sexually explicit photo in the Messages app, the photo would appear blank. For accounts of children 12 and under, parents could opt to receive notifications when a child viewed or sent such an image.
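
That described behavior reduces to a small decision rule: blank out explicit photos for child accounts, and notify parents only for children 12 and under whose parents opted in, and only when the child views or sends the image. The minimal Python sketch below captures that rule under those assumptions; the account fields and function names are hypothetical illustrations, not Apple's real APIs.

```python
# A minimal sketch of the Messages behavior described above. The account
# fields, function names and return values are hypothetical illustrations,
# not Apple's real APIs; classification of the photo is assumed to happen
# upstream on the device and is passed in as a boolean.
from dataclasses import dataclass


@dataclass
class ChildAccount:
    age: int
    parent_opted_into_notifications: bool  # parental opt-in, per the description


def handle_explicit_check(account: ChildAccount, explicit: bool,
                          child_viewed_or_sent: bool) -> dict:
    """Blank out photos classified as sexually explicit for child accounts, and
    notify parents only for children 12 and under whose parents opted in, and
    only when the child views or sends the image."""
    if not explicit:
        return {"display": "photo", "notify_parent": False}
    notify = (
        account.age <= 12
        and account.parent_opted_into_notifications
        and child_viewed_or_sent
    )
    return {"display": "blank", "notify_parent": notify}


# Example: a 10-year-old's account with parental notifications enabled.
kid = ChildAccount(age=10, parent_opted_into_notifications=True)
print(handle_explicit_check(kid, explicit=True, child_viewed_or_sent=True))
# -> {'display': 'blank', 'notify_parent': True}
```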

All of the child-protection measures announced last month are being re-evaluated, Apple said.

In an interview last month, Craig Federighi, Apple’s senior vice president of software engineering, conceded that the announcement of the tools hadn’t gone smoothly. “It’s really clear a lot of messages got jumbled pretty badly in terms of how things were understood,” Mr. Federighi said. “We wish that this would’ve come out a little more clearly for everyone because we feel very positive and strongly about what we’re doing.”

The new initiatives raised concerns in several areas. Some privacy advocates worried a tool capable of scanning users’ content could be adapted for other purposes, such as by governments looking for political speech—something Apple said it would stand against.

Some parents worried that their family photos of children in the bathtub, for example, could end up getting flagged as inappropriate. Apple sought to assure users that wouldn’t be the case.

Matthew Green, an associate professor at Johns Hopkins University who was an early critic of Apple’s plans, welcomed the delay but said he remained concerned about the company’s image-scanning algorithm. Cybersecurity researchers last month demonstrated ways that the algorithm Apple uses could misidentify photos.

He said that Apple should follow the lead of other technology companies and only scan shared images, not private ones.

“Don’t scan private photos,” he said. “It would definitely reduce the scope for abuse.”

Write to Joanna Stern at joanna.stern@wsj.com and Tim Higgins at Tim.Higgins@WSJ.com