Police are being encouraged to double their use of retrospective facial recognition (RFR) software to track down offenders over the next six months.
Policing minister Chris Philp has written to force chiefs, suggesting they aim to exceed 200,000 searches of still images against the Police National Database using the technology by May.
RFR allows authorities to use facial recognition after an event to establish who a person is or whether their image matches other media held on a database.
Mr Philp is also encouraging police to operate live facial recognition cameras more widely - in a move likely to frustrate some MPs and privacy campaigners, who have branded the use of the software "Orwellian".
Mr Philp said the advances would allow police to "stay one step ahead of criminals" and make Britain's streets safer.
Facial recognition technology uses biometric measures of a person's face and works even if part of their face is covered.
The live form of the software captures footage of crowds and compares it with a watch list of wanted suspects, alerting officers when there is a potential match.
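In broad terms, the matching step described above can be sketched in code. The snippet below is a purely illustrative toy, not any vendor's actual system: it assumes faces have already been converted into numeric feature vectors (real systems use deep-learning embeddings), and all names, vectors, and the threshold value are hypothetical.

```python
import math

# Toy illustration of watch-list matching: faces are represented as
# simple numeric feature vectors (real systems use learned embeddings).

def cosine_similarity(a, b):
    """Similarity between two feature vectors, in the range [-1, 1]."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

def match_against_watchlist(face, watchlist, threshold=0.9):
    """Return (name, score) for the best watch-list match above the
    threshold, or None if there is no potential match."""
    name, template = max(watchlist.items(),
                         key=lambda item: cosine_similarity(face, item[1]))
    score = cosine_similarity(face, template)
    return (name, score) if score >= threshold else None

# Hypothetical watch list of wanted suspects (feature vectors made up).
watchlist = {
    "suspect_a": [0.9, 0.1, 0.4],
    "suspect_b": [0.2, 0.8, 0.5],
}

# A face captured from crowd footage, close to suspect_a's template.
print(match_against_watchlist([0.88, 0.12, 0.41], watchlist))
```

When a captured face scores above the threshold against a watch-list entry, officers would be alerted; below it, the function returns `None`, mirroring the point made later in the piece that unmatched data is discarded.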
The call for it to be used more widely comes ahead of the government hosting a global artificial intelligence (AI) safety summit this week at Bletchley Park in Buckinghamshire.
Prime Minister Rishi Sunak wants to be seen as a world leader on AI - but the request from Mr Philp is likely to be met with concern by some parliamentarians.
A cross-party group of MPs and peers this month called for an "immediate stop" to the use of live facial recognition surveillance by police and private companies.
Former Brexit secretary David Davis, Liberal Democrats leader Sir Ed Davey, Green MP Caroline Lucas and former Labour shadow attorney general Baroness Shami Chakrabarti were among 65 members of the House of Commons and House of Lords who backed a call for a halt to its deployment.
The joint statement was also backed by 31 groups including Big Brother Watch, Liberty, Amnesty International and the Race Equality Foundation.
Mr Davis called live facial recognition "a suspicionless mass surveillance tool that has no place in Britain".
He added that it has "never been given explicit approval by Parliament".
The Home Office rejects such concerns, with officials saying that facial recognition camera use is strictly governed by data protection, equality, and human rights laws, and can only be used for a policing purpose where it is necessary and proportionate.
The department says AI surveillance methods such as facial recognition can help police accurately identify those wanted for serious crimes, as well as assist in finding missing people.
It argues that AI could free up police time and resources, allowing more officers to be based in communities.
Police put up notices in areas where they will be using live facial recognition, the Home Office said.
If the system does not make a match against a watch list, a person's data is deleted immediately and automatically.
The Home Office pointed out that live facial recognition technology has already been used successfully, including at last month's Arsenal v Tottenham north London derby at the Emirates Stadium, when police caught three wanted suspects, including one wanted for sexual offences.
Another wanted sex offender was identified at the coronation in May and sent back to prison the same day, the department added.
Mr Philp said that "AI technology is a powerful tool for good, with huge opportunities to advance policing and cut crime".
He added: "Facial recognition, including live facial recognition, has a sound legal basis that has been confirmed by the courts and has already enabled a large number of serious criminals to be caught, including for murder and sexual offences."