
New Algorithms Speed Up Complex Data Optimization

Researchers develop efficient deterministic methods to maximize submodular objective functions without relying on randomness, achieving near-optimal results in fewer computational steps, a key capability for large-scale machine learning and network analysis.

AI Research
November 05, 2025
2 min read

Optimizing complex data relationships is crucial for applications like social network analysis and recommendation systems, but existing methods often require extensive computational resources or rely on randomness, limiting their reliability and scalability. A new study introduces deterministic algorithms that efficiently maximize a class of functions called non-monotone DR-submodular functions under size constraints, offering faster and more stable solutions for real-world problems.
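The defining property of (DR-)submodular functions is diminishing returns: an element contributes less to the objective as the current solution grows. The toy sketch below illustrates this with a simple coverage function (this example is our own illustration, not taken from the paper):

```python
# Toy illustration of diminishing returns, the property that defines
# (DR-)submodular set functions: an element's marginal gain shrinks
# as the base set it is added to grows.

def coverage(S, universe_sets):
    """f(S) = number of distinct items covered by the chosen sets (submodular)."""
    covered = set()
    for e in S:
        covered |= universe_sets[e]
    return len(covered)

# Each element of the ground set covers a few items.
universe_sets = {0: {1, 2, 3}, 1: {3, 4}, 2: {4, 5}}

# Marginal gain of adding element 1 to a small set vs. a larger one:
gain_small = coverage({0, 1}, universe_sets) - coverage({0}, universe_sets)        # item 4 is new
gain_large = coverage({0, 2, 1}, universe_sets) - coverage({0, 2}, universe_sets)  # nothing new

assert gain_small >= gain_large  # diminishing returns
```

DR-submodularity generalizes this diminishing-returns condition beyond set functions, and the non-monotone case (where adding elements can decrease the objective) is what makes the maximization problem hard.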

The researchers introduce two algorithms, FastDrSub and FastDrSub+, that achieve constant-factor approximations with low query complexity. FastDrSub provides a 0.044 approximation ratio using O(n log k) queries, while FastDrSub+ improves the ratio to 1/4 − ϵ with similar efficiency. These are the first deterministic methods to combine strong approximation guarantees with near-linear time complexity, outperforming randomized state-of-the-art approaches in practical scenarios.

The methodology involves partitioning the solution space and using dynamic thresholding to guide the selection of elements, ensuring that gains in function value are accumulated incrementally. For FastDrSub+, an initial solution from FastDrSub is used to estimate an upper bound on the optimum, followed by a greedy process with a decaying threshold that refines the result. By systematically building candidate solutions, this approach avoids the run-to-run variability of randomized methods.

Experimental results on Revenue Maximization applications using datasets like Facebook, AstroPh, and Enron show that the algorithms achieve objective values comparable to benchmarks while requiring fewer queries and less running time. For instance, FastDrSub+ ran 1.5 to 2 times faster than randomized methods in some tests, with only slight variations in performance across different parameter settings. The data confirms that these methods maintain high quality even as problem sizes increase, making them suitable for large-scale use.

This work matters because it enables more efficient data optimization in fields like influence propagation and resource allocation, where quick, reliable decisions are essential. By reducing computational demands, the algorithms could help organizations analyze massive networks faster without sacrificing accuracy, potentially improving targeted advertising or fraud detection systems.

Limitations include the focus on size constraints and the need for further research to extend the methods to other constraint types or more complex functions. The paper notes that while the algorithms excel in tested scenarios, their performance in untested real-world applications remains to be fully explored.

Original Source

Read the complete research paper

View on arXiv

About the Author

Guilherme A.


Former dentist (MD) from Brazil, 41 years old, husband, and AI enthusiast. In 2020, he transitioned from a decade-long career in dentistry to pursue his passion for technology, entrepreneurship, and helping others grow.

Connect on LinkedIn